00:00:00.000 Started by upstream project "autotest-spdk-master-vs-dpdk-v23.11" build number 1064 00:00:00.000 originally caused by: 00:00:00.000 Started by upstream project "nightly-trigger" build number 3726 00:00:00.001 originally caused by: 00:00:00.001 Started by timer 00:00:00.053 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/nvme-vg-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-vg.groovy 00:00:00.054 The recommended git tool is: git 00:00:00.054 using credential 00000000-0000-0000-0000-000000000002 00:00:00.055 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/nvme-vg-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.090 Fetching changes from the remote Git repository 00:00:00.092 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.145 Using shallow fetch with depth 1 00:00:00.145 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.145 > git --version # timeout=10 00:00:00.184 > git --version # 'git version 2.39.2' 00:00:00.184 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.225 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.225 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5 00:00:04.079 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:04.091 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:04.103 Checking out Revision db4637e8b949f278f369ec13f70585206ccd9507 (FETCH_HEAD) 00:00:04.103 > git config core.sparsecheckout # timeout=10 00:00:04.113 > git read-tree -mu HEAD # timeout=10 00:00:04.129 > git checkout -f db4637e8b949f278f369ec13f70585206ccd9507 # timeout=5 00:00:04.144 Commit message: "jenkins/jjb-config: Add missing SPDK_TEST_NVME_INTERRUPT flag" 00:00:04.144 > git rev-list --no-walk db4637e8b949f278f369ec13f70585206ccd9507 # timeout=10 00:00:04.232 [Pipeline] Start of Pipeline 00:00:04.248 [Pipeline] library 00:00:04.250 Loading library shm_lib@master 00:00:04.251 Library shm_lib@master is cached. Copying from home. 00:00:04.265 [Pipeline] node 00:00:04.287 Running on VM-host-SM38 in /var/jenkins/workspace/nvme-vg-autotest 00:00:04.288 [Pipeline] { 00:00:04.295 [Pipeline] catchError 00:00:04.296 [Pipeline] { 00:00:04.304 [Pipeline] wrap 00:00:04.310 [Pipeline] { 00:00:04.316 [Pipeline] stage 00:00:04.317 [Pipeline] { (Prologue) 00:00:04.330 [Pipeline] echo 00:00:04.331 Node: VM-host-SM38 00:00:04.335 [Pipeline] cleanWs 00:00:04.345 [WS-CLEANUP] Deleting project workspace... 00:00:04.345 [WS-CLEANUP] Deferred wipeout is used... 
00:00:04.354 [WS-CLEANUP] done 00:00:04.538 [Pipeline] setCustomBuildProperty 00:00:04.624 [Pipeline] httpRequest 00:00:05.192 [Pipeline] echo 00:00:05.194 Sorcerer 10.211.164.20 is alive 00:00:05.199 [Pipeline] retry 00:00:05.200 [Pipeline] { 00:00:05.208 [Pipeline] httpRequest 00:00:05.212 HttpMethod: GET 00:00:05.212 URL: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:05.212 Sending request to url: http://10.211.164.20/packages/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:05.215 Response Code: HTTP/1.1 200 OK 00:00:05.215 Success: Status code 200 is in the accepted range: 200,404 00:00:05.215 Saving response body to /var/jenkins/workspace/nvme-vg-autotest/jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:05.623 [Pipeline] } 00:00:05.638 [Pipeline] // retry 00:00:05.644 [Pipeline] sh 00:00:05.929 + tar --no-same-owner -xf jbp_db4637e8b949f278f369ec13f70585206ccd9507.tar.gz 00:00:05.942 [Pipeline] httpRequest 00:00:06.532 [Pipeline] echo 00:00:06.533 Sorcerer 10.211.164.20 is alive 00:00:06.539 [Pipeline] retry 00:00:06.540 [Pipeline] { 00:00:06.551 [Pipeline] httpRequest 00:00:06.555 HttpMethod: GET 00:00:06.556 URL: http://10.211.164.20/packages/spdk_e01cb43b8578f9155d07a9bc6eee4e70a3af96b0.tar.gz 00:00:06.557 Sending request to url: http://10.211.164.20/packages/spdk_e01cb43b8578f9155d07a9bc6eee4e70a3af96b0.tar.gz 00:00:06.572 Response Code: HTTP/1.1 200 OK 00:00:06.572 Success: Status code 200 is in the accepted range: 200,404 00:00:06.573 Saving response body to /var/jenkins/workspace/nvme-vg-autotest/spdk_e01cb43b8578f9155d07a9bc6eee4e70a3af96b0.tar.gz 00:00:42.498 [Pipeline] } 00:00:42.516 [Pipeline] // retry 00:00:42.523 [Pipeline] sh 00:00:42.811 + tar --no-same-owner -xf spdk_e01cb43b8578f9155d07a9bc6eee4e70a3af96b0.tar.gz 00:00:45.391 [Pipeline] sh 00:00:45.676 + git -C spdk log --oneline -n5 00:00:45.676 e01cb43b8 mk/spdk.common.mk sed the minor version 00:00:45.676 d58eef2a2 nvme/rdma: Fix reinserting qpair in connecting list after stale state 00:00:45.676 2104eacf0 test/check_so_deps: use VERSION to look for prior tags 00:00:45.676 66289a6db build: use VERSION file for storing version 00:00:45.676 626389917 nvme/rdma: Don't limit max_sge if UMR is used 00:00:45.697 [Pipeline] withCredentials 00:00:45.710 > git --version # timeout=10 00:00:45.722 > git --version # 'git version 2.39.2' 00:00:45.743 Masking supported pattern matches of $GIT_PASSWORD or $GIT_ASKPASS 00:00:45.745 [Pipeline] { 00:00:45.754 [Pipeline] retry 00:00:45.757 [Pipeline] { 00:00:45.772 [Pipeline] sh 00:00:46.058 + git ls-remote http://dpdk.org/git/dpdk-stable v23.11 00:00:46.071 [Pipeline] } 00:00:46.089 [Pipeline] // retry 00:00:46.094 [Pipeline] } 00:00:46.109 [Pipeline] // withCredentials 00:00:46.119 [Pipeline] httpRequest 00:00:46.511 [Pipeline] echo 00:00:46.513 Sorcerer 10.211.164.20 is alive 00:00:46.522 [Pipeline] retry 00:00:46.524 [Pipeline] { 00:00:46.538 [Pipeline] httpRequest 00:00:46.543 HttpMethod: GET 00:00:46.544 URL: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:00:46.544 Sending request to url: http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:00:46.560 Response Code: HTTP/1.1 200 OK 00:00:46.561 Success: Status code 200 is in the accepted range: 200,404 00:00:46.561 Saving response body to /var/jenkins/workspace/nvme-vg-autotest/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:11.106 [Pipeline] } 00:01:11.123 [Pipeline] // retry 
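[Editor's note] The package-cache fetches above and the extract step just below amount to a plain download-and-unpack of a pre-tarred source tree. A minimal sketch, assuming curl as a stand-in for the Jenkins httpRequest step (the Sorcerer host, path, and tarball name are taken verbatim from the log; curl itself is an assumption):

    # Hypothetical equivalent of the httpRequest + 'tar --no-same-owner' pair;
    # the pipeline actually uses the Jenkins httpRequest step, not curl.
    url=http://10.211.164.20/packages/dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz
    curl -fSL "$url" -o "${url##*/}"
    # --no-same-owner keeps extracted files owned by the invoking user rather
    # than the UID recorded in the archive (matters when untarring as root).
    tar --no-same-owner -xf "${url##*/}"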
00:01:11.130 [Pipeline] sh 00:01:11.418 + tar --no-same-owner -xf dpdk_d15625009dced269fcec27fc81dd74fd58d54cdb.tar.gz 00:01:13.349 [Pipeline] sh 00:01:13.635 + git -C dpdk log --oneline -n5 00:01:13.636 eeb0605f11 version: 23.11.0 00:01:13.636 238778122a doc: update release notes for 23.11 00:01:13.636 46aa6b3cfc doc: fix description of RSS features 00:01:13.636 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:01:13.636 7e421ae345 devtools: support skipping forbid rule check 00:01:13.656 [Pipeline] writeFile 00:01:13.670 [Pipeline] sh 00:01:13.957 + jbp/jenkins/jjb-config/jobs/scripts/autorun_quirks.sh 00:01:13.970 [Pipeline] sh 00:01:14.255 + cat autorun-spdk.conf 00:01:14.255 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:14.255 SPDK_TEST_NVME=1 00:01:14.255 SPDK_TEST_FTL=1 00:01:14.255 SPDK_TEST_ISAL=1 00:01:14.255 SPDK_RUN_ASAN=1 00:01:14.255 SPDK_RUN_UBSAN=1 00:01:14.255 SPDK_TEST_XNVME=1 00:01:14.255 SPDK_TEST_NVME_FDP=1 00:01:14.255 SPDK_TEST_NATIVE_DPDK=v23.11 00:01:14.255 SPDK_RUN_EXTERNAL_DPDK=/home/vagrant/spdk_repo/dpdk/build 00:01:14.255 SPDK_ABI_DIR=/home/vagrant/spdk_repo/spdk-abi 00:01:14.264 RUN_NIGHTLY=1 00:01:14.267 [Pipeline] } 00:01:14.280 [Pipeline] // stage 00:01:14.294 [Pipeline] stage 00:01:14.296 [Pipeline] { (Run VM) 00:01:14.310 [Pipeline] sh 00:01:14.596 + jbp/jenkins/jjb-config/jobs/scripts/prepare_nvme.sh 00:01:14.596 + echo 'Start stage prepare_nvme.sh' 00:01:14.596 Start stage prepare_nvme.sh 00:01:14.596 + [[ -n 2 ]] 00:01:14.596 + disk_prefix=ex2 00:01:14.596 + [[ -n /var/jenkins/workspace/nvme-vg-autotest ]] 00:01:14.596 + [[ -e /var/jenkins/workspace/nvme-vg-autotest/autorun-spdk.conf ]] 00:01:14.596 + source /var/jenkins/workspace/nvme-vg-autotest/autorun-spdk.conf 00:01:14.596 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:14.596 ++ SPDK_TEST_NVME=1 00:01:14.596 ++ SPDK_TEST_FTL=1 00:01:14.596 ++ SPDK_TEST_ISAL=1 00:01:14.596 ++ SPDK_RUN_ASAN=1 00:01:14.596 ++ SPDK_RUN_UBSAN=1 00:01:14.596 ++ SPDK_TEST_XNVME=1 00:01:14.596 ++ SPDK_TEST_NVME_FDP=1 00:01:14.596 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:01:14.596 ++ SPDK_RUN_EXTERNAL_DPDK=/home/vagrant/spdk_repo/dpdk/build 00:01:14.596 ++ SPDK_ABI_DIR=/home/vagrant/spdk_repo/spdk-abi 00:01:14.596 ++ RUN_NIGHTLY=1 00:01:14.596 + cd /var/jenkins/workspace/nvme-vg-autotest 00:01:14.596 + nvme_files=() 00:01:14.596 + declare -A nvme_files 00:01:14.596 + backend_dir=/var/lib/libvirt/images/backends 00:01:14.596 + nvme_files['nvme.img']=5G 00:01:14.596 + nvme_files['nvme-cmb.img']=5G 00:01:14.596 + nvme_files['nvme-multi0.img']=4G 00:01:14.596 + nvme_files['nvme-multi1.img']=4G 00:01:14.596 + nvme_files['nvme-multi2.img']=4G 00:01:14.596 + nvme_files['nvme-openstack.img']=8G 00:01:14.596 + nvme_files['nvme-zns.img']=5G 00:01:14.596 + (( SPDK_TEST_NVME_PMR == 1 )) 00:01:14.596 + (( SPDK_TEST_FTL == 1 )) 00:01:14.596 + nvme_files["nvme-ftl.img"]=6G 00:01:14.596 + (( SPDK_TEST_NVME_FDP == 1 )) 00:01:14.596 + nvme_files["nvme-fdp.img"]=1G 00:01:14.596 + [[ ! 
-d /var/lib/libvirt/images/backends ]] 00:01:14.596 + for nvme in "${!nvme_files[@]}" 00:01:14.596 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex2-nvme-multi2.img -s 4G 00:01:14.859 Formatting '/var/lib/libvirt/images/backends/ex2-nvme-multi2.img', fmt=raw size=4294967296 preallocation=falloc 00:01:14.859 + for nvme in "${!nvme_files[@]}" 00:01:14.859 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex2-nvme-ftl.img -s 6G 00:01:15.822 Formatting '/var/lib/libvirt/images/backends/ex2-nvme-ftl.img', fmt=raw size=6442450944 preallocation=falloc 00:01:15.822 + for nvme in "${!nvme_files[@]}" 00:01:15.822 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex2-nvme-cmb.img -s 5G 00:01:15.822 Formatting '/var/lib/libvirt/images/backends/ex2-nvme-cmb.img', fmt=raw size=5368709120 preallocation=falloc 00:01:15.822 + for nvme in "${!nvme_files[@]}" 00:01:15.822 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex2-nvme-openstack.img -s 8G 00:01:15.822 Formatting '/var/lib/libvirt/images/backends/ex2-nvme-openstack.img', fmt=raw size=8589934592 preallocation=falloc 00:01:15.822 + for nvme in "${!nvme_files[@]}" 00:01:15.822 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex2-nvme-zns.img -s 5G 00:01:15.822 Formatting '/var/lib/libvirt/images/backends/ex2-nvme-zns.img', fmt=raw size=5368709120 preallocation=falloc 00:01:15.822 + for nvme in "${!nvme_files[@]}" 00:01:15.822 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex2-nvme-multi1.img -s 4G 00:01:16.393 Formatting '/var/lib/libvirt/images/backends/ex2-nvme-multi1.img', fmt=raw size=4294967296 preallocation=falloc 00:01:16.393 + for nvme in "${!nvme_files[@]}" 00:01:16.393 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex2-nvme-multi0.img -s 4G 00:01:17.340 Formatting '/var/lib/libvirt/images/backends/ex2-nvme-multi0.img', fmt=raw size=4294967296 preallocation=falloc 00:01:17.340 + for nvme in "${!nvme_files[@]}" 00:01:17.340 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex2-nvme-fdp.img -s 1G 00:01:17.340 Formatting '/var/lib/libvirt/images/backends/ex2-nvme-fdp.img', fmt=raw size=1073741824 preallocation=falloc 00:01:17.340 + for nvme in "${!nvme_files[@]}" 00:01:17.340 + sudo -E spdk/scripts/vagrant/create_nvme_img.sh -n /var/lib/libvirt/images/backends/ex2-nvme.img -s 5G 00:01:18.285 Formatting '/var/lib/libvirt/images/backends/ex2-nvme.img', fmt=raw size=5368709120 preallocation=falloc 00:01:18.285 ++ sudo grep -rl ex2-nvme.img /etc/libvirt/qemu 00:01:18.285 + echo 'End stage prepare_nvme.sh' 00:01:18.285 End stage prepare_nvme.sh 00:01:18.298 [Pipeline] sh 00:01:18.585 + DISTRO=fedora39 00:01:18.585 + CPUS=10 00:01:18.585 + RAM=12288 00:01:18.585 + jbp/jenkins/jjb-config/jobs/scripts/vagrant_create_vm.sh 00:01:18.585 Setup: -n 10 -s 12288 -x -p libvirt --qemu-emulator=/usr/local/qemu/vanilla-v8.0.0/bin/qemu-system-x86_64 --nic-model=e1000 -b /var/lib/libvirt/images/backends/ex2-nvme-ftl.img,nvme,,,,,true -b /var/lib/libvirt/images/backends/ex2-nvme.img -b /var/lib/libvirt/images/backends/ex2-nvme-multi0.img,nvme,/var/lib/libvirt/images/backends/ex2-nvme-multi1.img:/var/lib/libvirt/images/backends/ex2-nvme-multi2.img -b /var/lib/libvirt/images/backends/ex2-nvme-fdp.img,nvme,,,,,,on -H -a -v -f fedora39 00:01:18.585 00:01:18.585 
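[Editor's note] The "Formatting ... fmt=raw ... preallocation=falloc" lines in the stage above are characteristic qemu-img output. A minimal sketch of what each create_nvme_img.sh call likely reduces to; the script's internals are not shown in this log, so qemu-img and its exact flags are assumptions, while the sizes mirror the nvme_files map declared in the stage:

    # Hypothetical reconstruction of the backing-image creation loop.
    backend_dir=/var/lib/libvirt/images/backends
    declare -A nvme_files=(
        [nvme.img]=5G           [nvme-cmb.img]=5G    [nvme-zns.img]=5G
        [nvme-multi0.img]=4G    [nvme-multi1.img]=4G [nvme-multi2.img]=4G
        [nvme-openstack.img]=8G [nvme-ftl.img]=6G    [nvme-fdp.img]=1G
    )
    for img in "${!nvme_files[@]}"; do
        # raw format with falloc preallocation matches the log output above
        qemu-img create -f raw -o preallocation=falloc \
            "$backend_dir/ex2-$img" "${nvme_files[$img]}"
    done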
DIR=/var/jenkins/workspace/nvme-vg-autotest/spdk/scripts/vagrant 00:01:18.585 SPDK_DIR=/var/jenkins/workspace/nvme-vg-autotest/spdk 00:01:18.585 VAGRANT_TARGET=/var/jenkins/workspace/nvme-vg-autotest 00:01:18.585 HELP=0 00:01:18.585 DRY_RUN=0 00:01:18.585 NVME_FILE=/var/lib/libvirt/images/backends/ex2-nvme-ftl.img,/var/lib/libvirt/images/backends/ex2-nvme.img,/var/lib/libvirt/images/backends/ex2-nvme-multi0.img,/var/lib/libvirt/images/backends/ex2-nvme-fdp.img, 00:01:18.585 NVME_DISKS_TYPE=nvme,nvme,nvme,nvme, 00:01:18.585 NVME_AUTO_CREATE=0 00:01:18.585 NVME_DISKS_NAMESPACES=,,/var/lib/libvirt/images/backends/ex2-nvme-multi1.img:/var/lib/libvirt/images/backends/ex2-nvme-multi2.img,, 00:01:18.585 NVME_CMB=,,,, 00:01:18.585 NVME_PMR=,,,, 00:01:18.585 NVME_ZNS=,,,, 00:01:18.585 NVME_MS=true,,,, 00:01:18.585 NVME_FDP=,,,on, 00:01:18.585 SPDK_VAGRANT_DISTRO=fedora39 00:01:18.585 SPDK_VAGRANT_VMCPU=10 00:01:18.585 SPDK_VAGRANT_VMRAM=12288 00:01:18.585 SPDK_VAGRANT_PROVIDER=libvirt 00:01:18.585 SPDK_VAGRANT_HTTP_PROXY= 00:01:18.585 SPDK_QEMU_EMULATOR=/usr/local/qemu/vanilla-v8.0.0/bin/qemu-system-x86_64 00:01:18.585 SPDK_OPENSTACK_NETWORK=0 00:01:18.585 VAGRANT_PACKAGE_BOX=0 00:01:18.585 VAGRANTFILE=/var/jenkins/workspace/nvme-vg-autotest/spdk/scripts/vagrant/Vagrantfile 00:01:18.585 FORCE_DISTRO=true 00:01:18.585 VAGRANT_BOX_VERSION= 00:01:18.585 EXTRA_VAGRANTFILES= 00:01:18.585 NIC_MODEL=e1000 00:01:18.585 00:01:18.585 mkdir: created directory '/var/jenkins/workspace/nvme-vg-autotest/fedora39-libvirt' 00:01:18.585 /var/jenkins/workspace/nvme-vg-autotest/fedora39-libvirt /var/jenkins/workspace/nvme-vg-autotest 00:01:21.137 Bringing machine 'default' up with 'libvirt' provider... 00:01:21.399 ==> default: Creating image (snapshot of base box volume). 00:01:21.660 ==> default: Creating domain with the following settings... 
00:01:21.660 ==> default: -- Name: fedora39-39-1.5-1721788873-2326_default_1734238301_e14248995f982fe32c69 00:01:21.660 ==> default: -- Domain type: kvm 00:01:21.660 ==> default: -- Cpus: 10 00:01:21.660 ==> default: -- Feature: acpi 00:01:21.660 ==> default: -- Feature: apic 00:01:21.660 ==> default: -- Feature: pae 00:01:21.660 ==> default: -- Memory: 12288M 00:01:21.660 ==> default: -- Memory Backing: hugepages: 00:01:21.660 ==> default: -- Management MAC: 00:01:21.660 ==> default: -- Loader: 00:01:21.660 ==> default: -- Nvram: 00:01:21.660 ==> default: -- Base box: spdk/fedora39 00:01:21.660 ==> default: -- Storage pool: default 00:01:21.660 ==> default: -- Image: /var/lib/libvirt/images/fedora39-39-1.5-1721788873-2326_default_1734238301_e14248995f982fe32c69.img (20G) 00:01:21.660 ==> default: -- Volume Cache: default 00:01:21.660 ==> default: -- Kernel: 00:01:21.660 ==> default: -- Initrd: 00:01:21.660 ==> default: -- Graphics Type: vnc 00:01:21.660 ==> default: -- Graphics Port: -1 00:01:21.660 ==> default: -- Graphics IP: 127.0.0.1 00:01:21.660 ==> default: -- Graphics Password: Not defined 00:01:21.660 ==> default: -- Video Type: cirrus 00:01:21.661 ==> default: -- Video VRAM: 9216 00:01:21.661 ==> default: -- Sound Type: 00:01:21.661 ==> default: -- Keymap: en-us 00:01:21.661 ==> default: -- TPM Path: 00:01:21.661 ==> default: -- INPUT: type=mouse, bus=ps2 00:01:21.661 ==> default: -- Command line args: 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> value=nvme,id=nvme-0,serial=12340,addr=0x10, 00:01:21.661 ==> default: -> value=-drive, 00:01:21.661 ==> default: -> value=format=raw,file=/var/lib/libvirt/images/backends/ex2-nvme-ftl.img,if=none,id=nvme-0-drive0, 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> value=nvme-ns,drive=nvme-0-drive0,bus=nvme-0,nsid=1,zoned=false,logical_block_size=4096,physical_block_size=4096,ms=64, 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> value=nvme,id=nvme-1,serial=12341,addr=0x11, 00:01:21.661 ==> default: -> value=-drive, 00:01:21.661 ==> default: -> value=format=raw,file=/var/lib/libvirt/images/backends/ex2-nvme.img,if=none,id=nvme-1-drive0, 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> value=nvme-ns,drive=nvme-1-drive0,bus=nvme-1,nsid=1,zoned=false,logical_block_size=4096,physical_block_size=4096, 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> value=nvme,id=nvme-2,serial=12342,addr=0x12, 00:01:21.661 ==> default: -> value=-drive, 00:01:21.661 ==> default: -> value=format=raw,file=/var/lib/libvirt/images/backends/ex2-nvme-multi0.img,if=none,id=nvme-2-drive0, 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> value=nvme-ns,drive=nvme-2-drive0,bus=nvme-2,nsid=1,zoned=false,logical_block_size=4096,physical_block_size=4096, 00:01:21.661 ==> default: -> value=-drive, 00:01:21.661 ==> default: -> value=format=raw,file=/var/lib/libvirt/images/backends/ex2-nvme-multi1.img,if=none,id=nvme-2-drive1, 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> value=nvme-ns,drive=nvme-2-drive1,bus=nvme-2,nsid=2,zoned=false,logical_block_size=4096,physical_block_size=4096, 00:01:21.661 ==> default: -> value=-drive, 00:01:21.661 ==> default: -> value=format=raw,file=/var/lib/libvirt/images/backends/ex2-nvme-multi2.img,if=none,id=nvme-2-drive2, 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> 
value=nvme-ns,drive=nvme-2-drive2,bus=nvme-2,nsid=3,zoned=false,logical_block_size=4096,physical_block_size=4096, 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> value=nvme-subsys,id=fdp-subsys3,fdp=on,fdp.runs=96M,fdp.nrg=2,fdp.nruh=8, 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> value=nvme,id=nvme-3,serial=12343,addr=0x13,subsys=fdp-subsys3, 00:01:21.661 ==> default: -> value=-drive, 00:01:21.661 ==> default: -> value=format=raw,file=/var/lib/libvirt/images/backends/ex2-nvme-fdp.img,if=none,id=nvme-3-drive0, 00:01:21.661 ==> default: -> value=-device, 00:01:21.661 ==> default: -> value=nvme-ns,drive=nvme-3-drive0,bus=nvme-3,nsid=1,zoned=false,logical_block_size=4096,physical_block_size=4096, 00:01:21.922 ==> default: Creating shared folders metadata... 00:01:21.922 ==> default: Starting domain. 00:01:23.839 ==> default: Waiting for domain to get an IP address... 00:01:41.965 ==> default: Waiting for SSH to become available... 00:01:41.965 ==> default: Configuring and enabling network interfaces... 00:01:44.515 default: SSH address: 192.168.121.172:22 00:01:44.515 default: SSH username: vagrant 00:01:44.515 default: SSH auth method: private key 00:01:47.065 ==> default: Rsyncing folder: /mnt/jenkins_nvme/jenkins/workspace/nvme-vg-autotest/spdk/ => /home/vagrant/spdk_repo/spdk 00:01:55.297 ==> default: Rsyncing folder: /mnt/jenkins_nvme/jenkins/workspace/nvme-vg-autotest/dpdk/ => /home/vagrant/spdk_repo/dpdk 00:01:59.510 ==> default: Mounting SSHFS shared folder... 00:02:01.428 ==> default: Mounting folder via SSHFS: /mnt/jenkins_nvme/jenkins/workspace/nvme-vg-autotest/fedora39-libvirt/output => /home/vagrant/spdk_repo/output 00:02:01.428 ==> default: Checking Mount.. 00:02:02.812 ==> default: Folder Successfully Mounted! 00:02:02.812 00:02:02.812 SUCCESS! 00:02:02.812 00:02:02.812 cd to /var/jenkins/workspace/nvme-vg-autotest/fedora39-libvirt and type "vagrant ssh" to use. 00:02:02.812 Use vagrant "suspend" and vagrant "resume" to stop and start. 00:02:02.812 Use vagrant "destroy" followed by "rm -rf /var/jenkins/workspace/nvme-vg-autotest/fedora39-libvirt" to destroy all trace of vm. 00:02:02.812 00:02:02.820 [Pipeline] } 00:02:02.834 [Pipeline] // stage 00:02:02.844 [Pipeline] dir 00:02:02.844 Running in /var/jenkins/workspace/nvme-vg-autotest/fedora39-libvirt 00:02:02.846 [Pipeline] { 00:02:02.858 [Pipeline] catchError 00:02:02.860 [Pipeline] { 00:02:02.873 [Pipeline] sh 00:02:03.152 + vagrant ssh-config --host vagrant 00:02:03.152 + sed -ne '/^Host/,$p' 00:02:03.152 + tee ssh_conf 00:02:05.696 Host vagrant 00:02:05.696 HostName 192.168.121.172 00:02:05.696 User vagrant 00:02:05.696 Port 22 00:02:05.696 UserKnownHostsFile /dev/null 00:02:05.696 StrictHostKeyChecking no 00:02:05.696 PasswordAuthentication no 00:02:05.696 IdentityFile /var/lib/libvirt/images/.vagrant.d/boxes/spdk-VAGRANTSLASH-fedora39/39-1.5-1721788873-2326/libvirt/fedora39 00:02:05.696 IdentitiesOnly yes 00:02:05.696 LogLevel FATAL 00:02:05.696 ForwardAgent yes 00:02:05.696 ForwardX11 yes 00:02:05.696 00:02:05.715 [Pipeline] withEnv 00:02:05.717 [Pipeline] { 00:02:05.747 [Pipeline] sh 00:02:06.074 + /usr/local/bin/ssh -t -F ssh_conf vagrant@vagrant '#!/bin/bash 00:02:06.074 source /etc/os-release 00:02:06.074 [[ -e /image.version ]] && img=$(< /image.version) 00:02:06.074 # Minimal, systemd-like check. 
00:02:06.074 if [[ -e /.dockerenv ]]; then 00:02:06.074 # Clear garbage from the node'\''s name: 00:02:06.074 # agt-er_autotest_547-896 -> autotest_547-896 00:02:06.074 # $HOSTNAME is the actual container id 00:02:06.074 agent=$HOSTNAME@${DOCKER_SWARM_PLUGIN_JENKINS_AGENT_NAME#*_} 00:02:06.074 if grep -q "/etc/hostname" /proc/self/mountinfo; then 00:02:06.074 # We can assume this is a mount from a host where container is running, 00:02:06.074 # so fetch its hostname to easily identify the target swarm worker. 00:02:06.074 container="$(< /etc/hostname) ($agent)" 00:02:06.074 else 00:02:06.074 # Fallback 00:02:06.074 container=$agent 00:02:06.074 fi 00:02:06.074 fi 00:02:06.074 echo "${NAME} ${VERSION_ID}|$(uname -r)|${img:-N/A}|${container:-N/A}" 00:02:06.074 ' 00:02:06.348 [Pipeline] } 00:02:06.365 [Pipeline] // withEnv 00:02:06.373 [Pipeline] setCustomBuildProperty 00:02:06.388 [Pipeline] stage 00:02:06.390 [Pipeline] { (Tests) 00:02:06.406 [Pipeline] sh 00:02:06.691 + scp -F ssh_conf -r /var/jenkins/workspace/nvme-vg-autotest/jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh vagrant@vagrant:./ 00:02:06.969 [Pipeline] sh 00:02:07.256 + scp -F ssh_conf -r /var/jenkins/workspace/nvme-vg-autotest/jbp/jenkins/jjb-config/jobs/scripts/pkgdep-autoruner.sh vagrant@vagrant:./ 00:02:07.535 [Pipeline] timeout 00:02:07.535 Timeout set to expire in 50 min 00:02:07.537 [Pipeline] { 00:02:07.553 [Pipeline] sh 00:02:07.838 + /usr/local/bin/ssh -t -F ssh_conf vagrant@vagrant 'git -C spdk_repo/spdk reset --hard' 00:02:08.412 HEAD is now at e01cb43b8 mk/spdk.common.mk sed the minor version 00:02:08.427 [Pipeline] sh 00:02:08.713 + /usr/local/bin/ssh -t -F ssh_conf vagrant@vagrant 'sudo chown vagrant:vagrant spdk_repo' 00:02:08.993 [Pipeline] sh 00:02:09.281 + scp -F ssh_conf -r /var/jenkins/workspace/nvme-vg-autotest/autorun-spdk.conf vagrant@vagrant:spdk_repo 00:02:09.561 [Pipeline] sh 00:02:09.839 + /usr/local/bin/ssh -t -F ssh_conf vagrant@vagrant 'JOB_BASE_NAME=nvme-vg-autotest ./autoruner.sh spdk_repo' 00:02:10.098 ++ readlink -f spdk_repo 00:02:10.098 + DIR_ROOT=/home/vagrant/spdk_repo 00:02:10.098 + [[ -n /home/vagrant/spdk_repo ]] 00:02:10.098 + DIR_SPDK=/home/vagrant/spdk_repo/spdk 00:02:10.098 + DIR_OUTPUT=/home/vagrant/spdk_repo/output 00:02:10.098 + [[ -d /home/vagrant/spdk_repo/spdk ]] 00:02:10.098 + [[ ! 
-d /home/vagrant/spdk_repo/output ]] 00:02:10.098 + [[ -d /home/vagrant/spdk_repo/output ]] 00:02:10.098 + [[ nvme-vg-autotest == pkgdep-* ]] 00:02:10.098 + cd /home/vagrant/spdk_repo 00:02:10.098 + source /etc/os-release 00:02:10.098 ++ NAME='Fedora Linux' 00:02:10.098 ++ VERSION='39 (Cloud Edition)' 00:02:10.098 ++ ID=fedora 00:02:10.098 ++ VERSION_ID=39 00:02:10.098 ++ VERSION_CODENAME= 00:02:10.098 ++ PLATFORM_ID=platform:f39 00:02:10.098 ++ PRETTY_NAME='Fedora Linux 39 (Cloud Edition)' 00:02:10.098 ++ ANSI_COLOR='0;38;2;60;110;180' 00:02:10.098 ++ LOGO=fedora-logo-icon 00:02:10.098 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:39 00:02:10.098 ++ HOME_URL=https://fedoraproject.org/ 00:02:10.098 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/ 00:02:10.098 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:02:10.098 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:02:10.098 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:02:10.098 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=39 00:02:10.098 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:02:10.098 ++ REDHAT_SUPPORT_PRODUCT_VERSION=39 00:02:10.098 ++ SUPPORT_END=2024-11-12 00:02:10.098 ++ VARIANT='Cloud Edition' 00:02:10.098 ++ VARIANT_ID=cloud 00:02:10.098 + uname -a 00:02:10.098 Linux fedora39-cloud-1721788873-2326 6.8.9-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Jul 24 03:04:40 UTC 2024 x86_64 GNU/Linux 00:02:10.098 + sudo /home/vagrant/spdk_repo/spdk/scripts/setup.sh status 00:02:10.358 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:02:10.618 Hugepages 00:02:10.618 node hugesize free / total 00:02:10.618 node0 1048576kB 0 / 0 00:02:10.618 node0 2048kB 0 / 0 00:02:10.618 00:02:10.618 Type BDF Vendor Device NUMA Driver Device Block devices 00:02:10.618 virtio 0000:00:03.0 1af4 1001 unknown virtio-pci - vda 00:02:10.618 NVMe 0000:00:10.0 1b36 0010 unknown nvme nvme3 nvme3n1 00:02:10.618 NVMe 0000:00:11.0 1b36 0010 unknown nvme nvme0 nvme0n1 00:02:10.618 NVMe 0000:00:12.0 1b36 0010 unknown nvme nvme1 nvme1n1 nvme1n2 nvme1n3 00:02:10.618 NVMe 0000:00:13.0 1b36 0010 unknown nvme nvme2 nvme2n1 00:02:10.879 + rm -f /tmp/spdk-ld-path 00:02:10.879 + source autorun-spdk.conf 00:02:10.879 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:10.879 ++ SPDK_TEST_NVME=1 00:02:10.879 ++ SPDK_TEST_FTL=1 00:02:10.879 ++ SPDK_TEST_ISAL=1 00:02:10.879 ++ SPDK_RUN_ASAN=1 00:02:10.879 ++ SPDK_RUN_UBSAN=1 00:02:10.879 ++ SPDK_TEST_XNVME=1 00:02:10.879 ++ SPDK_TEST_NVME_FDP=1 00:02:10.879 ++ SPDK_TEST_NATIVE_DPDK=v23.11 00:02:10.879 ++ SPDK_RUN_EXTERNAL_DPDK=/home/vagrant/spdk_repo/dpdk/build 00:02:10.879 ++ SPDK_ABI_DIR=/home/vagrant/spdk_repo/spdk-abi 00:02:10.879 ++ RUN_NIGHTLY=1 00:02:10.879 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:02:10.879 + [[ -n '' ]] 00:02:10.879 + sudo git config --global --add safe.directory /home/vagrant/spdk_repo/spdk 00:02:10.879 + for M in /var/spdk/build-*-manifest.txt 00:02:10.879 + [[ -f /var/spdk/build-kernel-manifest.txt ]] 00:02:10.879 + cp /var/spdk/build-kernel-manifest.txt /home/vagrant/spdk_repo/output/ 00:02:10.879 + for M in /var/spdk/build-*-manifest.txt 00:02:10.879 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 00:02:10.879 + cp /var/spdk/build-pkg-manifest.txt /home/vagrant/spdk_repo/output/ 00:02:10.879 + for M in /var/spdk/build-*-manifest.txt 00:02:10.879 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:02:10.879 + cp /var/spdk/build-repo-manifest.txt /home/vagrant/spdk_repo/output/ 00:02:10.879 ++ uname 00:02:10.879 + [[ Linux == 
\L\i\n\u\x ]] 00:02:10.879 + sudo dmesg -T 00:02:10.879 + sudo dmesg --clear 00:02:10.879 + dmesg_pid=5757 00:02:10.879 + [[ Fedora Linux == FreeBSD ]] 00:02:10.879 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:10.879 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:02:10.879 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:02:10.879 + [[ -x /usr/src/fio-static/fio ]] 00:02:10.879 + sudo dmesg -Tw 00:02:10.879 + export FIO_BIN=/usr/src/fio-static/fio 00:02:10.879 + FIO_BIN=/usr/src/fio-static/fio 00:02:10.879 + [[ '' == \/\q\e\m\u\_\v\f\i\o\/* ]] 00:02:10.879 + [[ ! -v VFIO_QEMU_BIN ]] 00:02:10.879 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:02:10.879 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:10.879 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:02:10.879 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:02:10.879 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:10.879 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:02:10.879 + spdk/autorun.sh /home/vagrant/spdk_repo/autorun-spdk.conf 00:02:10.879 04:52:30 -- common/autotest_common.sh@1710 -- $ [[ n == y ]] 00:02:10.879 04:52:30 -- spdk/autorun.sh@20 -- $ source /home/vagrant/spdk_repo/autorun-spdk.conf 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@1 -- $ SPDK_RUN_FUNCTIONAL_TEST=1 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@2 -- $ SPDK_TEST_NVME=1 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@3 -- $ SPDK_TEST_FTL=1 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@4 -- $ SPDK_TEST_ISAL=1 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@5 -- $ SPDK_RUN_ASAN=1 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@6 -- $ SPDK_RUN_UBSAN=1 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@7 -- $ SPDK_TEST_XNVME=1 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@8 -- $ SPDK_TEST_NVME_FDP=1 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@9 -- $ SPDK_TEST_NATIVE_DPDK=v23.11 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@10 -- $ SPDK_RUN_EXTERNAL_DPDK=/home/vagrant/spdk_repo/dpdk/build 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@11 -- $ SPDK_ABI_DIR=/home/vagrant/spdk_repo/spdk-abi 00:02:10.879 04:52:30 -- spdk_repo/autorun-spdk.conf@12 -- $ RUN_NIGHTLY=1 00:02:10.879 04:52:30 -- spdk/autorun.sh@22 -- $ trap 'timing_finish || exit 1' EXIT 00:02:10.879 04:52:30 -- spdk/autorun.sh@25 -- $ /home/vagrant/spdk_repo/spdk/autobuild.sh /home/vagrant/spdk_repo/autorun-spdk.conf 00:02:10.879 04:52:31 -- common/autotest_common.sh@1710 -- $ [[ n == y ]] 00:02:10.879 04:52:31 -- common/autobuild_common.sh@15 -- $ source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:02:10.879 04:52:31 -- scripts/common.sh@15 -- $ shopt -s extglob 00:02:10.879 04:52:31 -- scripts/common.sh@544 -- $ [[ -e /bin/wpdk_common.sh ]] 00:02:10.879 04:52:31 -- scripts/common.sh@552 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:10.879 04:52:31 -- scripts/common.sh@553 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:10.879 04:52:31 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:10.879 04:52:31 -- 
paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:10.879 04:52:31 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:10.879 04:52:31 -- paths/export.sh@5 -- $ export PATH 00:02:10.879 04:52:31 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/vagrant/.local/bin:/home/vagrant/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:10.879 04:52:31 -- common/autobuild_common.sh@492 -- $ out=/home/vagrant/spdk_repo/spdk/../output 00:02:10.879 04:52:31 -- common/autobuild_common.sh@493 -- $ date +%s 00:02:10.879 04:52:31 -- common/autobuild_common.sh@493 -- $ mktemp -dt spdk_1734238351.XXXXXX 00:02:11.140 04:52:31 -- common/autobuild_common.sh@493 -- $ SPDK_WORKSPACE=/tmp/spdk_1734238351.Iz0xNh 00:02:11.140 04:52:31 -- common/autobuild_common.sh@495 -- $ [[ -n '' ]] 00:02:11.140 04:52:31 -- common/autobuild_common.sh@499 -- $ '[' -n v23.11 ']' 00:02:11.140 04:52:31 -- common/autobuild_common.sh@500 -- $ dirname /home/vagrant/spdk_repo/dpdk/build 00:02:11.140 04:52:31 -- common/autobuild_common.sh@500 -- $ scanbuild_exclude=' --exclude /home/vagrant/spdk_repo/dpdk' 00:02:11.140 04:52:31 -- common/autobuild_common.sh@506 -- $ scanbuild_exclude+=' --exclude /home/vagrant/spdk_repo/spdk/xnvme --exclude /tmp' 00:02:11.140 04:52:31 -- common/autobuild_common.sh@508 -- $ scanbuild='scan-build -o /home/vagrant/spdk_repo/spdk/../output/scan-build-tmp --exclude /home/vagrant/spdk_repo/dpdk --exclude /home/vagrant/spdk_repo/spdk/xnvme --exclude /tmp --status-bugs' 00:02:11.140 04:52:31 -- common/autobuild_common.sh@509 -- $ get_config_params 00:02:11.140 04:52:31 -- common/autotest_common.sh@409 -- $ xtrace_disable 00:02:11.140 04:52:31 -- common/autotest_common.sh@10 -- $ set +x 00:02:11.140 04:52:31 -- common/autobuild_common.sh@509 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-asan --enable-coverage --with-ublk --with-dpdk=/home/vagrant/spdk_repo/dpdk/build --with-xnvme' 00:02:11.140 04:52:31 -- common/autobuild_common.sh@511 -- $ start_monitor_resources 00:02:11.140 04:52:31 -- pm/common@17 -- $ local monitor 00:02:11.140 04:52:31 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.140 04:52:31 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:11.140 04:52:31 -- pm/common@25 -- $ 
sleep 1 00:02:11.140 04:52:31 -- pm/common@21 -- $ date +%s 00:02:11.140 04:52:31 -- pm/common@21 -- $ date +%s 00:02:11.140 04:52:31 -- pm/common@21 -- $ /home/vagrant/spdk_repo/spdk/scripts/perf/pm/collect-cpu-load -d /home/vagrant/spdk_repo/spdk/../output/power -l -p monitor.autobuild.sh.1734238351 00:02:11.140 04:52:31 -- pm/common@21 -- $ /home/vagrant/spdk_repo/spdk/scripts/perf/pm/collect-vmstat -d /home/vagrant/spdk_repo/spdk/../output/power -l -p monitor.autobuild.sh.1734238351 00:02:11.140 Redirecting to /home/vagrant/spdk_repo/spdk/../output/power/monitor.autobuild.sh.1734238351_collect-cpu-load.pm.log 00:02:11.140 Redirecting to /home/vagrant/spdk_repo/spdk/../output/power/monitor.autobuild.sh.1734238351_collect-vmstat.pm.log 00:02:12.083 04:52:32 -- common/autobuild_common.sh@512 -- $ trap stop_monitor_resources EXIT 00:02:12.083 04:52:32 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:02:12.083 04:52:32 -- spdk/autobuild.sh@12 -- $ umask 022 00:02:12.083 04:52:32 -- spdk/autobuild.sh@13 -- $ cd /home/vagrant/spdk_repo/spdk 00:02:12.083 04:52:32 -- spdk/autobuild.sh@16 -- $ date -u 00:02:12.083 Sun Dec 15 04:52:32 AM UTC 2024 00:02:12.083 04:52:32 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:02:12.083 v25.01-rc1-2-ge01cb43b8 00:02:12.083 04:52:32 -- spdk/autobuild.sh@19 -- $ '[' 1 -eq 1 ']' 00:02:12.083 04:52:32 -- spdk/autobuild.sh@20 -- $ run_test asan echo 'using asan' 00:02:12.083 04:52:32 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:02:12.083 04:52:32 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:02:12.084 04:52:32 -- common/autotest_common.sh@10 -- $ set +x 00:02:12.084 ************************************ 00:02:12.084 START TEST asan 00:02:12.084 ************************************ 00:02:12.084 using asan 00:02:12.084 04:52:32 asan -- common/autotest_common.sh@1129 -- $ echo 'using asan' 00:02:12.084 00:02:12.084 real 0m0.000s 00:02:12.084 user 0m0.000s 00:02:12.084 sys 0m0.000s 00:02:12.084 ************************************ 00:02:12.084 END TEST asan 00:02:12.084 ************************************ 00:02:12.084 04:52:32 asan -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:02:12.084 04:52:32 asan -- common/autotest_common.sh@10 -- $ set +x 00:02:12.084 04:52:32 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:02:12.084 04:52:32 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan' 00:02:12.084 04:52:32 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:02:12.084 04:52:32 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:02:12.084 04:52:32 -- common/autotest_common.sh@10 -- $ set +x 00:02:12.084 ************************************ 00:02:12.084 START TEST ubsan 00:02:12.084 ************************************ 00:02:12.084 using ubsan 00:02:12.084 04:52:32 ubsan -- common/autotest_common.sh@1129 -- $ echo 'using ubsan' 00:02:12.084 00:02:12.084 real 0m0.000s 00:02:12.084 user 0m0.000s 00:02:12.084 sys 0m0.000s 00:02:12.084 04:52:32 ubsan -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:02:12.084 04:52:32 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:02:12.084 ************************************ 00:02:12.084 END TEST ubsan 00:02:12.084 ************************************ 00:02:12.084 04:52:32 -- spdk/autobuild.sh@27 -- $ '[' -n v23.11 ']' 00:02:12.084 04:52:32 -- spdk/autobuild.sh@28 -- $ build_native_dpdk 00:02:12.084 04:52:32 -- common/autobuild_common.sh@449 -- $ run_test build_native_dpdk _build_native_dpdk 00:02:12.084 04:52:32 -- common/autotest_common.sh@1105 -- $ '[' 2 -le 1 
']' 00:02:12.084 04:52:32 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:02:12.084 04:52:32 -- common/autotest_common.sh@10 -- $ set +x 00:02:12.084 ************************************ 00:02:12.084 START TEST build_native_dpdk 00:02:12.084 ************************************ 00:02:12.084 04:52:32 build_native_dpdk -- common/autotest_common.sh@1129 -- $ _build_native_dpdk 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@48 -- $ local external_dpdk_dir 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@49 -- $ local external_dpdk_base_dir 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@50 -- $ local compiler_version 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@51 -- $ local compiler 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@52 -- $ local dpdk_kmods 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@53 -- $ local repo=dpdk 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@55 -- $ compiler=gcc 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@61 -- $ export CC=gcc 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@61 -- $ CC=gcc 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *clang* ]] 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@63 -- $ [[ gcc != *gcc* ]] 00:02:12.084 04:52:32 build_native_dpdk -- common/autobuild_common.sh@68 -- $ gcc -dumpversion 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@68 -- $ compiler_version=13 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@69 -- $ compiler_version=13 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@70 -- $ external_dpdk_dir=/home/vagrant/spdk_repo/dpdk/build 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@71 -- $ dirname /home/vagrant/spdk_repo/dpdk/build 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@71 -- $ external_dpdk_base_dir=/home/vagrant/spdk_repo/dpdk 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@73 -- $ [[ ! 
-d /home/vagrant/spdk_repo/dpdk ]] 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@82 -- $ orgdir=/home/vagrant/spdk_repo/spdk 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@83 -- $ git -C /home/vagrant/spdk_repo/dpdk log --oneline -n 5 00:02:12.345 eeb0605f11 version: 23.11.0 00:02:12.345 238778122a doc: update release notes for 23.11 00:02:12.345 46aa6b3cfc doc: fix description of RSS features 00:02:12.345 dd88f51a57 devtools: forbid DPDK API in cnxk base driver 00:02:12.345 7e421ae345 devtools: support skipping forbid rule check 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@85 -- $ dpdk_cflags='-fPIC -g -fcommon' 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@86 -- $ dpdk_ldflags= 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@87 -- $ dpdk_ver=23.11.0 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ gcc == *gcc* ]] 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@89 -- $ [[ 13 -ge 5 ]] 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@90 -- $ dpdk_cflags+=' -Werror' 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ gcc == *gcc* ]] 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@93 -- $ [[ 13 -ge 10 ]] 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@94 -- $ dpdk_cflags+=' -Wno-stringop-overflow' 00:02:12.345 04:52:32 build_native_dpdk -- common/autobuild_common.sh@102 -- $ DPDK_DRIVERS=("bus" "bus/pci" "bus/vdev" "mempool/ring" "net/i40e" "net/i40e/base" "power/acpi" "power/amd_pstate" "power/cppc" "power/intel_pstate" "power/intel_uncore" "power/kvm_vm") 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@103 -- $ local mlx5_libs_added=n 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]] 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@104 -- $ [[ 0 -eq 1 ]] 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@146 -- $ [[ 0 -eq 1 ]] 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@174 -- $ cd /home/vagrant/spdk_repo/dpdk 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@175 -- $ uname -s 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@175 -- $ '[' Linux = Linux ']' 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@176 -- $ lt 23.11.0 21.11.0 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 21.11.0 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 
00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 21 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@353 -- $ local d=21 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 21 =~ ^[0-9]+$ ]] 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@355 -- $ echo 21 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=21 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@367 -- $ return 1 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@180 -- $ patch -p1 00:02:12.346 patching file config/rte_config.h 00:02:12.346 Hunk #1 succeeded at 60 (offset 1 line). 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@183 -- $ lt 23.11.0 24.07.0 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@373 -- $ cmp_versions 23.11.0 '<' 24.07.0 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=<' 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@345 -- $ : 1 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@368 -- $ return 0 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@184 -- $ patch -p1 00:02:12.346 patching file lib/pcapng/rte_pcapng.c 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@186 -- $ ge 23.11.0 24.07.0 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@376 -- $ cmp_versions 23.11.0 '>=' 24.07.0 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@333 -- $ local ver1 ver1_l 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@334 -- $ local ver2 ver2_l 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@336 -- $ IFS=.-: 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@336 -- $ read -ra ver1 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@337 -- $ IFS=.-: 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@337 -- $ read -ra ver2 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@338 -- $ local 'op=>=' 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@340 -- $ ver1_l=3 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@341 -- $ ver2_l=3 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@343 -- $ local lt=0 gt=0 eq=0 v 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@344 -- $ case "$op" in 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@348 -- $ : 1 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@364 -- $ (( v = 0 )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@364 -- $ (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@365 -- $ decimal 23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@353 -- $ local d=23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 23 =~ ^[0-9]+$ ]] 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@355 -- $ echo 23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@365 -- $ ver1[v]=23 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@366 -- $ decimal 24 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@353 -- $ local d=24 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@354 -- $ [[ 24 =~ ^[0-9]+$ ]] 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@355 -- $ echo 24 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@366 -- $ ver2[v]=24 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@367 -- $ (( ver1[v] > ver2[v] )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@368 -- $ (( ver1[v] < ver2[v] )) 00:02:12.346 04:52:32 build_native_dpdk -- scripts/common.sh@368 -- $ return 1 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@190 -- $ dpdk_kmods=false 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@191 -- $ uname -s 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@191 -- $ '[' Linux = FreeBSD ']' 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@195 -- $ printf %s, bus bus/pci bus/vdev mempool/ring net/i40e net/i40e/base power/acpi power/amd_pstate power/cppc power/intel_pstate power/intel_uncore power/kvm_vm 00:02:12.346 04:52:32 build_native_dpdk -- common/autobuild_common.sh@195 -- $ meson build-tmp --prefix=/home/vagrant/spdk_repo/dpdk/build --libdir lib -Denable_docs=false -Denable_kmods=false -Dtests=false -Dc_link_args= '-Dc_args=-fPIC -g -fcommon -Werror -Wno-stringop-overflow' -Dmachine=native -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm, 00:02:17.643 The Meson build system 00:02:17.643 Version: 1.5.0 00:02:17.643 Source dir: /home/vagrant/spdk_repo/dpdk 00:02:17.643 Build dir: /home/vagrant/spdk_repo/dpdk/build-tmp 00:02:17.643 Build type: native build 00:02:17.643 Program cat found: YES (/usr/bin/cat) 00:02:17.643 Project name: DPDK 00:02:17.643 Project version: 23.11.0 00:02:17.643 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:02:17.643 C linker for the host machine: gcc ld.bfd 2.40-14 00:02:17.643 Host machine cpu family: x86_64 00:02:17.643 Host machine cpu: x86_64 00:02:17.643 Message: ## Building in Developer Mode ## 00:02:17.643 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:17.643 Program check-symbols.sh found: YES (/home/vagrant/spdk_repo/dpdk/buildtools/check-symbols.sh) 00:02:17.643 Program options-ibverbs-static.sh found: YES (/home/vagrant/spdk_repo/dpdk/buildtools/options-ibverbs-static.sh) 00:02:17.643 Program python3 found: YES (/usr/bin/python3) 00:02:17.643 Program cat found: YES (/usr/bin/cat) 00:02:17.643 config/meson.build:113: WARNING: The "machine" option is deprecated. Please use "cpu_instruction_set" instead. 
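[Editor's note] Two asides on the trace above. First, the meson WARNING just printed is benign: as the message itself says, newer DPDK releases spell this option -Dcpu_instruction_set=native rather than -Dmachine=native, and 23.11 still accepts the old form. Second, the lt/ge xtrace blocks before the meson call are SPDK's version gates deciding which compatibility patches to apply to the external DPDK. A condensed, hypothetical re-implementation of the cmp_versions helper visible in that trace (names and edge-case handling are assumptions inferred from the xtrace alone):

    # Split versions on '.', '-' or ':' and compare digit group by digit group.
    cmp_versions() {
        local IFS=.-: op=$2 v lt=0
        local -a ver1 ver2
        read -ra ver1 <<< "$1"
        read -ra ver2 <<< "$3"
        for (( v = 0; v < (${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]}); v++ )); do
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && break          # strictly greater
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && { lt=1; break; }
        done
        case "$op" in
            '<')  (( lt == 1 )) ;;   # lt: strictly less than
            '>=') (( lt == 0 )) ;;   # ge: not less than
        esac
    }
    cmp_versions 23.11.0 '<' 21.11.0 || echo 'not older'   # matches the 'return 1' above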
00:02:17.643 Compiler for C supports arguments -march=native: YES 00:02:17.643 Checking for size of "void *" : 8 00:02:17.643 Checking for size of "void *" : 8 (cached) 00:02:17.643 Library m found: YES 00:02:17.643 Library numa found: YES 00:02:17.643 Has header "numaif.h" : YES 00:02:17.643 Library fdt found: NO 00:02:17.643 Library execinfo found: NO 00:02:17.643 Has header "execinfo.h" : YES 00:02:17.643 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:02:17.643 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:17.643 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:17.643 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:17.643 Run-time dependency openssl found: YES 3.1.1 00:02:17.643 Run-time dependency libpcap found: YES 1.10.4 00:02:17.643 Has header "pcap.h" with dependency libpcap: YES 00:02:17.643 Compiler for C supports arguments -Wcast-qual: YES 00:02:17.643 Compiler for C supports arguments -Wdeprecated: YES 00:02:17.643 Compiler for C supports arguments -Wformat: YES 00:02:17.643 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:17.643 Compiler for C supports arguments -Wformat-security: NO 00:02:17.643 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:17.643 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:17.643 Compiler for C supports arguments -Wnested-externs: YES 00:02:17.643 Compiler for C supports arguments -Wold-style-definition: YES 00:02:17.643 Compiler for C supports arguments -Wpointer-arith: YES 00:02:17.643 Compiler for C supports arguments -Wsign-compare: YES 00:02:17.643 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:17.643 Compiler for C supports arguments -Wundef: YES 00:02:17.643 Compiler for C supports arguments -Wwrite-strings: YES 00:02:17.643 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:17.643 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:17.643 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:17.643 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:17.643 Program objdump found: YES (/usr/bin/objdump) 00:02:17.643 Compiler for C supports arguments -mavx512f: YES 00:02:17.643 Checking if "AVX512 checking" compiles: YES 00:02:17.643 Fetching value of define "__SSE4_2__" : 1 00:02:17.643 Fetching value of define "__AES__" : 1 00:02:17.643 Fetching value of define "__AVX__" : 1 00:02:17.643 Fetching value of define "__AVX2__" : 1 00:02:17.643 Fetching value of define "__AVX512BW__" : 1 00:02:17.643 Fetching value of define "__AVX512CD__" : 1 00:02:17.643 Fetching value of define "__AVX512DQ__" : 1 00:02:17.643 Fetching value of define "__AVX512F__" : 1 00:02:17.643 Fetching value of define "__AVX512VL__" : 1 00:02:17.643 Fetching value of define "__PCLMUL__" : 1 00:02:17.643 Fetching value of define "__RDRND__" : 1 00:02:17.643 Fetching value of define "__RDSEED__" : 1 00:02:17.643 Fetching value of define "__VPCLMULQDQ__" : 1 00:02:17.643 Fetching value of define "__znver1__" : (undefined) 00:02:17.643 Fetching value of define "__znver2__" : (undefined) 00:02:17.643 Fetching value of define "__znver3__" : (undefined) 00:02:17.643 Fetching value of define "__znver4__" : (undefined) 00:02:17.643 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:17.643 Message: lib/log: Defining dependency "log" 00:02:17.643 Message: lib/kvargs: Defining dependency "kvargs" 00:02:17.643 Message: lib/telemetry: Defining dependency "telemetry" 
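[Editor's note] The 'Fetching value of define "__AVX512F__"' probes scattered through this output are meson asking the C compiler which SIMD macros -march=native enables, so that AVX-512 code paths can be gated on the result. The same information can be pulled by hand, assuming plain gcc on the build host:

    # Dump the predefined macros for the native ISA and filter the AVX-512 ones.
    echo | gcc -x c -march=native -dM -E - | grep -E '__AVX512(F|BW|CD|DQ|VL)__'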
00:02:17.643 Checking for function "getentropy" : NO 00:02:17.643 Message: lib/eal: Defining dependency "eal" 00:02:17.643 Message: lib/ring: Defining dependency "ring" 00:02:17.643 Message: lib/rcu: Defining dependency "rcu" 00:02:17.643 Message: lib/mempool: Defining dependency "mempool" 00:02:17.643 Message: lib/mbuf: Defining dependency "mbuf" 00:02:17.643 Fetching value of define "__PCLMUL__" : 1 (cached) 00:02:17.643 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:17.643 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:17.643 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:17.643 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:17.643 Fetching value of define "__VPCLMULQDQ__" : 1 (cached) 00:02:17.643 Compiler for C supports arguments -mpclmul: YES 00:02:17.643 Compiler for C supports arguments -maes: YES 00:02:17.643 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:17.643 Compiler for C supports arguments -mavx512bw: YES 00:02:17.643 Compiler for C supports arguments -mavx512dq: YES 00:02:17.643 Compiler for C supports arguments -mavx512vl: YES 00:02:17.643 Compiler for C supports arguments -mvpclmulqdq: YES 00:02:17.643 Compiler for C supports arguments -mavx2: YES 00:02:17.643 Compiler for C supports arguments -mavx: YES 00:02:17.643 Message: lib/net: Defining dependency "net" 00:02:17.643 Message: lib/meter: Defining dependency "meter" 00:02:17.643 Message: lib/ethdev: Defining dependency "ethdev" 00:02:17.643 Message: lib/pci: Defining dependency "pci" 00:02:17.643 Message: lib/cmdline: Defining dependency "cmdline" 00:02:17.643 Message: lib/metrics: Defining dependency "metrics" 00:02:17.643 Message: lib/hash: Defining dependency "hash" 00:02:17.643 Message: lib/timer: Defining dependency "timer" 00:02:17.643 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:17.643 Fetching value of define "__AVX512VL__" : 1 (cached) 00:02:17.643 Fetching value of define "__AVX512CD__" : 1 (cached) 00:02:17.643 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:17.643 Message: lib/acl: Defining dependency "acl" 00:02:17.643 Message: lib/bbdev: Defining dependency "bbdev" 00:02:17.644 Message: lib/bitratestats: Defining dependency "bitratestats" 00:02:17.644 Run-time dependency libelf found: YES 0.191 00:02:17.644 Message: lib/bpf: Defining dependency "bpf" 00:02:17.644 Message: lib/cfgfile: Defining dependency "cfgfile" 00:02:17.644 Message: lib/compressdev: Defining dependency "compressdev" 00:02:17.644 Message: lib/cryptodev: Defining dependency "cryptodev" 00:02:17.644 Message: lib/distributor: Defining dependency "distributor" 00:02:17.644 Message: lib/dmadev: Defining dependency "dmadev" 00:02:17.644 Message: lib/efd: Defining dependency "efd" 00:02:17.644 Message: lib/eventdev: Defining dependency "eventdev" 00:02:17.644 Message: lib/dispatcher: Defining dependency "dispatcher" 00:02:17.644 Message: lib/gpudev: Defining dependency "gpudev" 00:02:17.644 Message: lib/gro: Defining dependency "gro" 00:02:17.644 Message: lib/gso: Defining dependency "gso" 00:02:17.644 Message: lib/ip_frag: Defining dependency "ip_frag" 00:02:17.644 Message: lib/jobstats: Defining dependency "jobstats" 00:02:17.644 Message: lib/latencystats: Defining dependency "latencystats" 00:02:17.644 Message: lib/lpm: Defining dependency "lpm" 00:02:17.644 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:17.644 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:17.644 Fetching value of define "__AVX512IFMA__" : 1 00:02:17.644 Message: 
lib/member: Defining dependency "member" 00:02:17.644 Message: lib/pcapng: Defining dependency "pcapng" 00:02:17.644 Compiler for C supports arguments -Wno-cast-qual: YES 00:02:17.644 Message: lib/power: Defining dependency "power" 00:02:17.644 Message: lib/rawdev: Defining dependency "rawdev" 00:02:17.644 Message: lib/regexdev: Defining dependency "regexdev" 00:02:17.644 Message: lib/mldev: Defining dependency "mldev" 00:02:17.644 Message: lib/rib: Defining dependency "rib" 00:02:17.644 Message: lib/reorder: Defining dependency "reorder" 00:02:17.644 Message: lib/sched: Defining dependency "sched" 00:02:17.644 Message: lib/security: Defining dependency "security" 00:02:17.644 Message: lib/stack: Defining dependency "stack" 00:02:17.644 Has header "linux/userfaultfd.h" : YES 00:02:17.644 Has header "linux/vduse.h" : YES 00:02:17.644 Message: lib/vhost: Defining dependency "vhost" 00:02:17.644 Message: lib/ipsec: Defining dependency "ipsec" 00:02:17.644 Message: lib/pdcp: Defining dependency "pdcp" 00:02:17.644 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:17.644 Fetching value of define "__AVX512DQ__" : 1 (cached) 00:02:17.644 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:17.644 Message: lib/fib: Defining dependency "fib" 00:02:17.644 Message: lib/port: Defining dependency "port" 00:02:17.644 Message: lib/pdump: Defining dependency "pdump" 00:02:17.644 Message: lib/table: Defining dependency "table" 00:02:17.644 Message: lib/pipeline: Defining dependency "pipeline" 00:02:17.644 Message: lib/graph: Defining dependency "graph" 00:02:17.644 Message: lib/node: Defining dependency "node" 00:02:17.644 Compiler for C supports arguments -Wno-format-truncation: YES (cached) 00:02:17.644 Message: drivers/bus/pci: Defining dependency "bus_pci" 00:02:17.644 Message: drivers/bus/vdev: Defining dependency "bus_vdev" 00:02:17.644 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:18.218 Compiler for C supports arguments -Wno-sign-compare: YES 00:02:18.218 Compiler for C supports arguments -Wno-unused-value: YES 00:02:18.218 Compiler for C supports arguments -Wno-format: YES 00:02:18.218 Compiler for C supports arguments -Wno-format-security: YES 00:02:18.218 Compiler for C supports arguments -Wno-format-nonliteral: YES 00:02:18.218 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:02:18.218 Compiler for C supports arguments -Wno-unused-but-set-variable: YES 00:02:18.218 Compiler for C supports arguments -Wno-unused-parameter: YES 00:02:18.218 Fetching value of define "__AVX512F__" : 1 (cached) 00:02:18.218 Fetching value of define "__AVX512BW__" : 1 (cached) 00:02:18.218 Compiler for C supports arguments -mavx512f: YES (cached) 00:02:18.218 Compiler for C supports arguments -mavx512bw: YES (cached) 00:02:18.218 Compiler for C supports arguments -march=skylake-avx512: YES 00:02:18.218 Message: drivers/net/i40e: Defining dependency "net_i40e" 00:02:18.218 Has header "sys/epoll.h" : YES 00:02:18.218 Program doxygen found: YES (/usr/local/bin/doxygen) 00:02:18.218 Configuring doxy-api-html.conf using configuration 00:02:18.218 Configuring doxy-api-man.conf using configuration 00:02:18.218 Program mandb found: YES (/usr/bin/mandb) 00:02:18.218 Program sphinx-build found: NO 00:02:18.218 Configuring rte_build_config.h using configuration 00:02:18.218 Message: 00:02:18.218 ================= 00:02:18.218 Applications Enabled 00:02:18.218 ================= 00:02:18.218 00:02:18.218 apps: 00:02:18.218 dumpcap, graph, pdump, proc-info, test-acl, test-bbdev, 
test-cmdline, test-compress-perf,
00:02:18.218 test-crypto-perf, test-dma-perf, test-eventdev, test-fib, test-flow-perf, test-gpudev, test-mldev, test-pipeline,
00:02:18.218 test-pmd, test-regex, test-sad, test-security-perf,
00:02:18.218
00:02:18.218 Message:
00:02:18.218 =================
00:02:18.218 Libraries Enabled
00:02:18.218 =================
00:02:18.218
00:02:18.218 libs:
00:02:18.218 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:18.218 net, meter, ethdev, pci, cmdline, metrics, hash, timer,
00:02:18.218 acl, bbdev, bitratestats, bpf, cfgfile, compressdev, cryptodev, distributor,
00:02:18.218 dmadev, efd, eventdev, dispatcher, gpudev, gro, gso, ip_frag,
00:02:18.218 jobstats, latencystats, lpm, member, pcapng, power, rawdev, regexdev,
00:02:18.218 mldev, rib, reorder, sched, security, stack, vhost, ipsec,
00:02:18.218 pdcp, fib, port, pdump, table, pipeline, graph, node,
00:02:18.218
00:02:18.218
00:02:18.218 Message:
00:02:18.218 ===============
00:02:18.218 Drivers Enabled
00:02:18.218 ===============
00:02:18.218
00:02:18.218 common:
00:02:18.218
00:02:18.218 bus:
00:02:18.218 pci, vdev,
00:02:18.218 mempool:
00:02:18.218 ring,
00:02:18.218 dma:
00:02:18.218
00:02:18.218 net:
00:02:18.218 i40e,
00:02:18.218 raw:
00:02:18.218
00:02:18.218 crypto:
00:02:18.218
00:02:18.218 compress:
00:02:18.218
00:02:18.218 regex:
00:02:18.218
00:02:18.218 ml:
00:02:18.218
00:02:18.218 vdpa:
00:02:18.218
00:02:18.218 event:
00:02:18.218
00:02:18.218 baseband:
00:02:18.218
00:02:18.218 gpu:
00:02:18.218
00:02:18.218
00:02:18.218 Message:
00:02:18.218 =================
00:02:18.218 Content Skipped
00:02:18.218 =================
00:02:18.218
00:02:18.218 apps:
00:02:18.218
00:02:18.218 libs:
00:02:18.218
00:02:18.218 drivers:
00:02:18.218 common/cpt: not in enabled drivers build config
00:02:18.218 common/dpaax: not in enabled drivers build config
00:02:18.218 common/iavf: not in enabled drivers build config
00:02:18.218 common/idpf: not in enabled drivers build config
00:02:18.218 common/mvep: not in enabled drivers build config
00:02:18.218 common/octeontx: not in enabled drivers build config
00:02:18.218 bus/auxiliary: not in enabled drivers build config
00:02:18.218 bus/cdx: not in enabled drivers build config
00:02:18.218 bus/dpaa: not in enabled drivers build config
00:02:18.218 bus/fslmc: not in enabled drivers build config
00:02:18.218 bus/ifpga: not in enabled drivers build config
00:02:18.218 bus/platform: not in enabled drivers build config
00:02:18.218 bus/vmbus: not in enabled drivers build config
00:02:18.218 common/cnxk: not in enabled drivers build config
00:02:18.218 common/mlx5: not in enabled drivers build config
00:02:18.218 common/nfp: not in enabled drivers build config
00:02:18.218 common/qat: not in enabled drivers build config
00:02:18.218 common/sfc_efx: not in enabled drivers build config
00:02:18.218 mempool/bucket: not in enabled drivers build config
00:02:18.218 mempool/cnxk: not in enabled drivers build config
00:02:18.218 mempool/dpaa: not in enabled drivers build config
00:02:18.218 mempool/dpaa2: not in enabled drivers build config
00:02:18.218 mempool/octeontx: not in enabled drivers build config
00:02:18.219 mempool/stack: not in enabled drivers build config
00:02:18.219 dma/cnxk: not in enabled drivers build config
00:02:18.219 dma/dpaa: not in enabled drivers build config
00:02:18.219 dma/dpaa2: not in enabled drivers build config
00:02:18.219 dma/hisilicon: not in enabled drivers build config
00:02:18.219 dma/idxd: not in enabled drivers build config
00:02:18.219 dma/ioat: not in enabled drivers build config
00:02:18.219 dma/skeleton: not in enabled drivers build config
00:02:18.219 net/af_packet: not in enabled drivers build config
00:02:18.219 net/af_xdp: not in enabled drivers build config
00:02:18.219 net/ark: not in enabled drivers build config
00:02:18.219 net/atlantic: not in enabled drivers build config
00:02:18.219 net/avp: not in enabled drivers build config
00:02:18.219 net/axgbe: not in enabled drivers build config
00:02:18.219 net/bnx2x: not in enabled drivers build config
00:02:18.219 net/bnxt: not in enabled drivers build config
00:02:18.219 net/bonding: not in enabled drivers build config
00:02:18.219 net/cnxk: not in enabled drivers build config
00:02:18.219 net/cpfl: not in enabled drivers build config
00:02:18.219 net/cxgbe: not in enabled drivers build config
00:02:18.219 net/dpaa: not in enabled drivers build config
00:02:18.219 net/dpaa2: not in enabled drivers build config
00:02:18.219 net/e1000: not in enabled drivers build config
00:02:18.219 net/ena: not in enabled drivers build config
00:02:18.219 net/enetc: not in enabled drivers build config
00:02:18.219 net/enetfec: not in enabled drivers build config
00:02:18.219 net/enic: not in enabled drivers build config
00:02:18.219 net/failsafe: not in enabled drivers build config
00:02:18.219 net/fm10k: not in enabled drivers build config
00:02:18.219 net/gve: not in enabled drivers build config
00:02:18.219 net/hinic: not in enabled drivers build config
00:02:18.219 net/hns3: not in enabled drivers build config
00:02:18.219 net/iavf: not in enabled drivers build config
00:02:18.219 net/ice: not in enabled drivers build config
00:02:18.219 net/idpf: not in enabled drivers build config
00:02:18.219 net/igc: not in enabled drivers build config
00:02:18.219 net/ionic: not in enabled drivers build config
00:02:18.219 net/ipn3ke: not in enabled drivers build config
00:02:18.219 net/ixgbe: not in enabled drivers build config
00:02:18.219 net/mana: not in enabled drivers build config
00:02:18.219 net/memif: not in enabled drivers build config
00:02:18.219 net/mlx4: not in enabled drivers build config
00:02:18.219 net/mlx5: not in enabled drivers build config
00:02:18.219 net/mvneta: not in enabled drivers build config
00:02:18.219 net/mvpp2: not in enabled drivers build config
00:02:18.219 net/netvsc: not in enabled drivers build config
00:02:18.219 net/nfb: not in enabled drivers build config
00:02:18.219 net/nfp: not in enabled drivers build config
00:02:18.219 net/ngbe: not in enabled drivers build config
00:02:18.219 net/null: not in enabled drivers build config
00:02:18.219 net/octeontx: not in enabled drivers build config
00:02:18.219 net/octeon_ep: not in enabled drivers build config
00:02:18.219 net/pcap: not in enabled drivers build config
00:02:18.219 net/pfe: not in enabled drivers build config
00:02:18.219 net/qede: not in enabled drivers build config
00:02:18.219 net/ring: not in enabled drivers build config
00:02:18.219 net/sfc: not in enabled drivers build config
00:02:18.219 net/softnic: not in enabled drivers build config
00:02:18.219 net/tap: not in enabled drivers build config
00:02:18.219 net/thunderx: not in enabled drivers build config
00:02:18.219 net/txgbe: not in enabled drivers build config
00:02:18.219 net/vdev_netvsc: not in enabled drivers build config
00:02:18.219 net/vhost: not in enabled drivers build config
00:02:18.219 net/virtio: not in enabled drivers build config
00:02:18.219 net/vmxnet3: not in enabled drivers build config
00:02:18.219 raw/cnxk_bphy: not in enabled drivers build config
00:02:18.219 raw/cnxk_gpio: not in enabled drivers build config
00:02:18.219 raw/dpaa2_cmdif: not in enabled drivers build config
00:02:18.219 raw/ifpga: not in enabled drivers build config
00:02:18.219 raw/ntb: not in enabled drivers build config
00:02:18.219 raw/skeleton: not in enabled drivers build config
00:02:18.219 crypto/armv8: not in enabled drivers build config
00:02:18.219 crypto/bcmfs: not in enabled drivers build config
00:02:18.219 crypto/caam_jr: not in enabled drivers build config
00:02:18.219 crypto/ccp: not in enabled drivers build config
00:02:18.219 crypto/cnxk: not in enabled drivers build config
00:02:18.219 crypto/dpaa_sec: not in enabled drivers build config
00:02:18.219 crypto/dpaa2_sec: not in enabled drivers build config
00:02:18.219 crypto/ipsec_mb: not in enabled drivers build config
00:02:18.219 crypto/mlx5: not in enabled drivers build config
00:02:18.219 crypto/mvsam: not in enabled drivers build config
00:02:18.219 crypto/nitrox: not in enabled drivers build config
00:02:18.219 crypto/null: not in enabled drivers build config
00:02:18.219 crypto/octeontx: not in enabled drivers build config
00:02:18.219 crypto/openssl: not in enabled drivers build config
00:02:18.219 crypto/scheduler: not in enabled drivers build config
00:02:18.219 crypto/uadk: not in enabled drivers build config
00:02:18.219 crypto/virtio: not in enabled drivers build config
00:02:18.219 compress/isal: not in enabled drivers build config
00:02:18.219 compress/mlx5: not in enabled drivers build config
00:02:18.219 compress/octeontx: not in enabled drivers build config
00:02:18.219 compress/zlib: not in enabled drivers build config
00:02:18.219 regex/mlx5: not in enabled drivers build config
00:02:18.219 regex/cn9k: not in enabled drivers build config
00:02:18.219 ml/cnxk: not in enabled drivers build config
00:02:18.219 vdpa/ifc: not in enabled drivers build config
00:02:18.219 vdpa/mlx5: not in enabled drivers build config
00:02:18.219 vdpa/nfp: not in enabled drivers build config
00:02:18.219 vdpa/sfc: not in enabled drivers build config
00:02:18.219 event/cnxk: not in enabled drivers build config
00:02:18.219 event/dlb2: not in enabled drivers build config
00:02:18.219 event/dpaa: not in enabled drivers build config
00:02:18.219 event/dpaa2: not in enabled drivers build config
00:02:18.219 event/dsw: not in enabled drivers build config
00:02:18.219 event/opdl: not in enabled drivers build config
00:02:18.219 event/skeleton: not in enabled drivers build config
00:02:18.219 event/sw: not in enabled drivers build config
00:02:18.219 event/octeontx: not in enabled drivers build config
00:02:18.219 baseband/acc: not in enabled drivers build config
00:02:18.219 baseband/fpga_5gnr_fec: not in enabled drivers build config
00:02:18.219 baseband/fpga_lte_fec: not in enabled drivers build config
00:02:18.219 baseband/la12xx: not in enabled drivers build config
00:02:18.219 baseband/null: not in enabled drivers build config
00:02:18.219 baseband/turbo_sw: not in enabled drivers build config
00:02:18.219 gpu/cuda: not in enabled drivers build config
00:02:18.219
00:02:18.219
00:02:18.219 Build targets in project: 215
00:02:18.219
00:02:18.219 DPDK 23.11.0
00:02:18.219
00:02:18.219 User defined options
00:02:18.219 libdir : lib
00:02:18.219 prefix : /home/vagrant/spdk_repo/dpdk/build
00:02:18.219 c_args : -fPIC -g -fcommon -Werror -Wno-stringop-overflow
00:02:18.219 c_link_args :
00:02:18.219 enable_docs : false
00:02:18.219 enable_drivers: bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm,
00:02:18.219 enable_kmods : false
00:02:18.219 machine : native
00:02:18.219 tests : false
00:02:18.219
00:02:18.219 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:18.219 WARNING: Running the setup command as `meson [options]` instead of `meson setup [options]` is ambiguous and deprecated.
00:02:18.481 04:52:38 build_native_dpdk -- common/autobuild_common.sh@199 -- $ ninja -C /home/vagrant/spdk_repo/dpdk/build-tmp -j10
00:02:18.481 ninja: Entering directory `/home/vagrant/spdk_repo/dpdk/build-tmp'
00:02:18.481 [1/705] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:02:18.481 [2/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o
00:02:18.481 [3/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o
00:02:18.743 [4/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:02:18.743 [5/705] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o
00:02:18.743 [6/705] Linking static target lib/librte_kvargs.a
00:02:18.743 [7/705] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o
00:02:18.743 [8/705] Compiling C object lib/librte_log.a.p/log_log.c.o
00:02:18.743 [9/705] Linking static target lib/librte_log.a
00:02:18.743 [10/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o
00:02:19.005 [11/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o
00:02:19.005 [12/705] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output)
00:02:19.005 [13/705] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o
00:02:19.005 [14/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o
00:02:19.005 [15/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o
00:02:19.005 [16/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o
00:02:19.265 [17/705] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output)
00:02:19.265 [18/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o
00:02:19.265 [19/705] Linking target lib/librte_log.so.24.0
00:02:19.265 [20/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o
00:02:19.265 [21/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o
00:02:19.265 [22/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o
00:02:19.265 [23/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o
00:02:19.265 [24/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o
00:02:19.265 [25/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o
00:02:19.523 [26/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o
00:02:19.523 [27/705] Generating symbol file lib/librte_log.so.24.0.p/librte_log.so.24.0.symbols
00:02:19.523 [28/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o
00:02:19.523 [29/705] Linking target lib/librte_kvargs.so.24.0
00:02:19.523 [30/705] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o
00:02:19.523 [31/705] Linking static target lib/librte_telemetry.a
00:02:19.523 [32/705] Generating symbol file lib/librte_kvargs.so.24.0.p/librte_kvargs.so.24.0.symbols
00:02:19.523 [33/705] Compiling C
object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:19.523 [34/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:19.782 [35/705] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:19.782 [36/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:19.782 [37/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:19.782 [38/705] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:19.782 [39/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:19.782 [40/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:19.782 [41/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:19.782 [42/705] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:19.782 [43/705] Linking target lib/librte_telemetry.so.24.0 00:02:19.782 [44/705] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:20.041 [45/705] Generating symbol file lib/librte_telemetry.so.24.0.p/librte_telemetry.so.24.0.symbols 00:02:20.041 [46/705] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:20.041 [47/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:20.041 [48/705] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:20.300 [49/705] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:20.300 [50/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:20.300 [51/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:20.300 [52/705] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:20.300 [53/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:20.300 [54/705] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:20.300 [55/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:20.300 [56/705] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:20.300 [57/705] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:20.300 [58/705] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:20.558 [59/705] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:20.558 [60/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:20.558 [61/705] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:20.558 [62/705] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:20.558 [63/705] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:20.558 [64/705] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:20.558 [65/705] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:20.558 [66/705] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:20.558 [67/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:20.558 [68/705] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:20.818 [69/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:20.818 [70/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:20.818 [71/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:20.818 [72/705] Compiling C object 
lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:20.818 [73/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:20.818 [74/705] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:20.818 [75/705] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:20.818 [76/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:20.819 [77/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:20.819 [78/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:21.080 [79/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:21.080 [80/705] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:21.342 [81/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:21.342 [82/705] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:21.342 [83/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:21.342 [84/705] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:21.342 [85/705] Linking static target lib/librte_ring.a 00:02:21.342 [86/705] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:21.342 [87/705] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:21.602 [88/705] Linking static target lib/librte_eal.a 00:02:21.602 [89/705] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:21.602 [90/705] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:21.602 [91/705] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:21.602 [92/705] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:21.602 [93/705] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:21.602 [94/705] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:21.602 [95/705] Linking static target lib/librte_mempool.a 00:02:21.862 [96/705] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:21.862 [97/705] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:21.862 [98/705] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:21.862 [99/705] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:21.862 [100/705] Linking static target lib/librte_rcu.a 00:02:21.862 [101/705] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:21.862 [102/705] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:22.123 [103/705] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:22.123 [104/705] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.123 [105/705] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:22.123 [106/705] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.123 [107/705] Linking static target lib/librte_meter.a 00:02:22.393 [108/705] Compiling C object lib/librte_net.a.p/net_net_crc_avx512.c.o 00:02:22.393 [109/705] Linking static target lib/librte_net.a 00:02:22.393 [110/705] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.393 [111/705] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:22.393 [112/705] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:22.393 [113/705] Linking static target lib/librte_mbuf.a 00:02:22.393 [114/705] Compiling C object 
lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:22.393 [115/705] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:22.393 [116/705] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.393 [117/705] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:22.654 [118/705] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:22.654 [119/705] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:22.916 [120/705] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:23.177 [121/705] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:23.177 [122/705] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:23.177 [123/705] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:23.177 [124/705] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:23.177 [125/705] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:23.177 [126/705] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:23.177 [127/705] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:23.177 [128/705] Linking static target lib/librte_pci.a 00:02:23.177 [129/705] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:23.177 [130/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:23.439 [131/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:23.439 [132/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:23.439 [133/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:23.439 [134/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:23.439 [135/705] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.439 [136/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:23.439 [137/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:23.439 [138/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:23.439 [139/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:23.439 [140/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:23.439 [141/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:23.700 [142/705] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:23.700 [143/705] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:23.700 [144/705] Linking static target lib/librte_cmdline.a 00:02:23.700 [145/705] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:23.700 [146/705] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:23.700 [147/705] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics_telemetry.c.o 00:02:23.961 [148/705] Compiling C object lib/librte_metrics.a.p/metrics_rte_metrics.c.o 00:02:23.961 [149/705] Linking static target lib/librte_metrics.a 00:02:24.222 [150/705] Generating lib/metrics.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.222 [151/705] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.222 [152/705] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:02:24.222 [153/705] Linking static target 
lib/librte_timer.a 00:02:24.222 [154/705] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:24.483 [155/705] Compiling C object lib/librte_acl.a.p/acl_acl_gen.c.o 00:02:24.483 [156/705] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.483 [157/705] Compiling C object lib/librte_acl.a.p/acl_tb_mem.c.o 00:02:24.483 [158/705] Compiling C object lib/librte_acl.a.p/acl_rte_acl.c.o 00:02:24.745 [159/705] Compiling C object lib/librte_acl.a.p/acl_acl_run_scalar.c.o 00:02:25.006 [160/705] Compiling C object lib/librte_acl.a.p/acl_acl_bld.c.o 00:02:25.006 [161/705] Compiling C object lib/librte_bitratestats.a.p/bitratestats_rte_bitrate.c.o 00:02:25.006 [162/705] Linking static target lib/librte_bitratestats.a 00:02:25.007 [163/705] Compiling C object lib/librte_bpf.a.p/bpf_bpf.c.o 00:02:25.007 [164/705] Generating lib/bitratestats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.268 [165/705] Compiling C object lib/librte_bpf.a.p/bpf_bpf_dump.c.o 00:02:25.268 [166/705] Compiling C object lib/librte_bbdev.a.p/bbdev_rte_bbdev.c.o 00:02:25.268 [167/705] Linking static target lib/librte_bbdev.a 00:02:25.268 [168/705] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:25.268 [169/705] Linking static target lib/librte_hash.a 00:02:25.268 [170/705] Compiling C object lib/acl/libavx2_tmp.a.p/acl_run_avx2.c.o 00:02:25.268 [171/705] Linking static target lib/acl/libavx2_tmp.a 00:02:25.530 [172/705] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load.c.o 00:02:25.530 [173/705] Compiling C object lib/librte_bpf.a.p/bpf_bpf_stub.c.o 00:02:25.530 [174/705] Compiling C object lib/librte_bpf.a.p/bpf_bpf_exec.c.o 00:02:25.530 [175/705] Generating lib/bbdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.792 [176/705] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.792 [177/705] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:25.792 [178/705] Compiling C object lib/librte_acl.a.p/acl_acl_run_sse.c.o 00:02:25.792 [179/705] Linking static target lib/librte_ethdev.a 00:02:25.792 [180/705] Compiling C object lib/librte_bpf.a.p/bpf_bpf_pkt.c.o 00:02:25.792 [181/705] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.792 [182/705] Compiling C object lib/librte_cfgfile.a.p/cfgfile_rte_cfgfile.c.o 00:02:25.792 [183/705] Linking static target lib/librte_cfgfile.a 00:02:25.792 [184/705] Linking target lib/librte_eal.so.24.0 00:02:26.054 [185/705] Compiling C object lib/librte_bpf.a.p/bpf_bpf_load_elf.c.o 00:02:26.054 [186/705] Compiling C object lib/librte_bpf.a.p/bpf_bpf_convert.c.o 00:02:26.054 [187/705] Generating symbol file lib/librte_eal.so.24.0.p/librte_eal.so.24.0.symbols 00:02:26.054 [188/705] Linking target lib/librte_ring.so.24.0 00:02:26.054 [189/705] Generating lib/cfgfile.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.054 [190/705] Generating symbol file lib/librte_ring.so.24.0.p/librte_ring.so.24.0.symbols 00:02:26.054 [191/705] Linking target lib/librte_meter.so.24.0 00:02:26.315 [192/705] Linking target lib/librte_rcu.so.24.0 00:02:26.315 [193/705] Linking target lib/librte_mempool.so.24.0 00:02:26.315 [194/705] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:26.315 [195/705] Generating symbol file lib/librte_rcu.so.24.0.p/librte_rcu.so.24.0.symbols 00:02:26.315 [196/705] Generating symbol file 
lib/librte_meter.so.24.0.p/librte_meter.so.24.0.symbols 00:02:26.315 [197/705] Generating symbol file lib/librte_mempool.so.24.0.p/librte_mempool.so.24.0.symbols 00:02:26.315 [198/705] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:26.315 [199/705] Compiling C object lib/librte_bpf.a.p/bpf_bpf_validate.c.o 00:02:26.315 [200/705] Linking target lib/librte_pci.so.24.0 00:02:26.315 [201/705] Linking target lib/librte_timer.so.24.0 00:02:26.315 [202/705] Linking target lib/librte_mbuf.so.24.0 00:02:26.315 [203/705] Linking target lib/librte_cfgfile.so.24.0 00:02:26.315 [204/705] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:26.315 [205/705] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:26.315 [206/705] Linking static target lib/librte_compressdev.a 00:02:26.315 [207/705] Generating symbol file lib/librte_timer.so.24.0.p/librte_timer.so.24.0.symbols 00:02:26.315 [208/705] Generating symbol file lib/librte_mbuf.so.24.0.p/librte_mbuf.so.24.0.symbols 00:02:26.315 [209/705] Generating symbol file lib/librte_pci.so.24.0.p/librte_pci.so.24.0.symbols 00:02:26.315 [210/705] Linking target lib/librte_net.so.24.0 00:02:26.315 [211/705] Linking target lib/librte_bbdev.so.24.0 00:02:26.576 [212/705] Compiling C object lib/librte_bpf.a.p/bpf_bpf_jit_x86.c.o 00:02:26.576 [213/705] Linking static target lib/librte_bpf.a 00:02:26.576 [214/705] Generating symbol file lib/librte_net.so.24.0.p/librte_net.so.24.0.symbols 00:02:26.576 [215/705] Linking target lib/librte_cmdline.so.24.0 00:02:26.576 [216/705] Linking target lib/librte_hash.so.24.0 00:02:26.576 [217/705] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_match_sse.c.o 00:02:26.576 [218/705] Generating symbol file lib/librte_hash.so.24.0.p/librte_hash.so.24.0.symbols 00:02:26.837 [219/705] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:26.837 [220/705] Generating lib/bpf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.837 [221/705] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor_single.c.o 00:02:26.837 [222/705] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:26.837 [223/705] Linking target lib/librte_compressdev.so.24.0 00:02:26.837 [224/705] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:27.098 [225/705] Compiling C object lib/librte_acl.a.p/acl_acl_run_avx512.c.o 00:02:27.098 [226/705] Linking static target lib/librte_acl.a 00:02:27.098 [227/705] Compiling C object lib/librte_distributor.a.p/distributor_rte_distributor.c.o 00:02:27.098 [228/705] Linking static target lib/librte_distributor.a 00:02:27.098 [229/705] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_private.c.o 00:02:27.098 [230/705] Generating lib/distributor.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.098 [231/705] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:27.098 [232/705] Linking static target lib/librte_dmadev.a 00:02:27.098 [233/705] Compiling C object lib/librte_eventdev.a.p/eventdev_eventdev_trace_points.c.o 00:02:27.098 [234/705] Linking target lib/librte_distributor.so.24.0 00:02:27.358 [235/705] Generating lib/acl.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.358 [236/705] Linking target lib/librte_acl.so.24.0 00:02:27.358 [237/705] Generating symbol file 
lib/librte_acl.so.24.0.p/librte_acl.so.24.0.symbols 00:02:27.359 [238/705] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.620 [239/705] Linking target lib/librte_dmadev.so.24.0 00:02:27.620 [240/705] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_ring.c.o 00:02:27.620 [241/705] Generating symbol file lib/librte_dmadev.so.24.0.p/librte_dmadev.so.24.0.symbols 00:02:27.620 [242/705] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_dma_adapter.c.o 00:02:27.881 [243/705] Compiling C object lib/librte_efd.a.p/efd_rte_efd.c.o 00:02:27.881 [244/705] Linking static target lib/librte_efd.a 00:02:27.881 [245/705] Generating lib/efd.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.881 [246/705] Linking target lib/librte_efd.so.24.0 00:02:27.881 [247/705] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_crypto_adapter.c.o 00:02:27.881 [248/705] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:28.143 [249/705] Linking static target lib/librte_cryptodev.a 00:02:28.143 [250/705] Compiling C object lib/librte_dispatcher.a.p/dispatcher_rte_dispatcher.c.o 00:02:28.143 [251/705] Linking static target lib/librte_dispatcher.a 00:02:28.143 [252/705] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_timer_adapter.c.o 00:02:28.404 [253/705] Compiling C object lib/librte_gpudev.a.p/gpudev_gpudev.c.o 00:02:28.404 [254/705] Linking static target lib/librte_gpudev.a 00:02:28.404 [255/705] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_tx_adapter.c.o 00:02:28.404 [256/705] Compiling C object lib/librte_gro.a.p/gro_rte_gro.c.o 00:02:28.404 [257/705] Compiling C object lib/librte_gro.a.p/gro_gro_tcp4.c.o 00:02:28.404 [258/705] Generating lib/dispatcher.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.666 [259/705] Compiling C object lib/librte_gro.a.p/gro_gro_tcp6.c.o 00:02:28.666 [260/705] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_eventdev.c.o 00:02:28.927 [261/705] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.927 [262/705] Compiling C object lib/librte_gso.a.p/gso_gso_tcp4.c.o 00:02:28.927 [263/705] Linking target lib/librte_cryptodev.so.24.0 00:02:28.927 [264/705] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_tcp4.c.o 00:02:28.927 [265/705] Compiling C object lib/librte_gro.a.p/gro_gro_udp4.c.o 00:02:28.927 [266/705] Compiling C object lib/librte_gso.a.p/gso_gso_common.c.o 00:02:28.927 [267/705] Generating lib/gpudev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:28.927 [268/705] Linking target lib/librte_gpudev.so.24.0 00:02:28.927 [269/705] Compiling C object lib/librte_gso.a.p/gso_gso_udp4.c.o 00:02:28.927 [270/705] Generating symbol file lib/librte_cryptodev.so.24.0.p/librte_cryptodev.so.24.0.symbols 00:02:28.927 [271/705] Compiling C object lib/librte_gro.a.p/gro_gro_vxlan_udp4.c.o 00:02:28.927 [272/705] Linking static target lib/librte_gro.a 00:02:29.189 [273/705] Compiling C object lib/librte_gso.a.p/gso_gso_tunnel_tcp4.c.o 00:02:29.189 [274/705] Compiling C object lib/librte_eventdev.a.p/eventdev_rte_event_eth_rx_adapter.c.o 00:02:29.189 [275/705] Linking static target lib/librte_eventdev.a 00:02:29.189 [276/705] Compiling C object lib/librte_gso.a.p/gso_rte_gso.c.o 00:02:29.189 [277/705] Generating lib/gro.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.189 [278/705] Compiling C 
object lib/librte_gso.a.p/gso_gso_tunnel_udp4.c.o 00:02:29.189 [279/705] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_reassembly.c.o 00:02:29.189 [280/705] Linking static target lib/librte_gso.a 00:02:29.450 [281/705] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.450 [282/705] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_fragmentation.c.o 00:02:29.450 [283/705] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv6_reassembly.c.o 00:02:29.450 [284/705] Generating lib/gso.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.450 [285/705] Linking target lib/librte_ethdev.so.24.0 00:02:29.450 [286/705] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ipv4_fragmentation.c.o 00:02:29.450 [287/705] Generating symbol file lib/librte_ethdev.so.24.0.p/librte_ethdev.so.24.0.symbols 00:02:29.450 [288/705] Linking target lib/librte_metrics.so.24.0 00:02:29.450 [289/705] Compiling C object lib/librte_ip_frag.a.p/ip_frag_rte_ip_frag_common.c.o 00:02:29.450 [290/705] Compiling C object lib/librte_jobstats.a.p/jobstats_rte_jobstats.c.o 00:02:29.450 [291/705] Linking target lib/librte_bpf.so.24.0 00:02:29.450 [292/705] Linking target lib/librte_gro.so.24.0 00:02:29.711 [293/705] Generating symbol file lib/librte_metrics.so.24.0.p/librte_metrics.so.24.0.symbols 00:02:29.711 [294/705] Compiling C object lib/librte_ip_frag.a.p/ip_frag_ip_frag_internal.c.o 00:02:29.712 [295/705] Linking target lib/librte_gso.so.24.0 00:02:29.712 [296/705] Linking static target lib/librte_ip_frag.a 00:02:29.712 [297/705] Linking static target lib/librte_jobstats.a 00:02:29.712 [298/705] Generating symbol file lib/librte_bpf.so.24.0.p/librte_bpf.so.24.0.symbols 00:02:29.712 [299/705] Linking target lib/librte_bitratestats.so.24.0 00:02:29.712 [300/705] Compiling C object lib/librte_latencystats.a.p/latencystats_rte_latencystats.c.o 00:02:29.712 [301/705] Linking static target lib/librte_latencystats.a 00:02:29.712 [302/705] Compiling C object lib/librte_member.a.p/member_rte_member.c.o 00:02:29.712 [303/705] Generating lib/ip_frag.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.712 [304/705] Linking target lib/librte_ip_frag.so.24.0 00:02:29.712 [305/705] Generating lib/jobstats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.974 [306/705] Generating lib/latencystats.sym_chk with a custom command (wrapped by meson to capture output) 00:02:29.974 [307/705] Linking target lib/librte_jobstats.so.24.0 00:02:29.974 [308/705] Linking target lib/librte_latencystats.so.24.0 00:02:29.974 [309/705] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm.c.o 00:02:29.974 [310/705] Generating symbol file lib/librte_ip_frag.so.24.0.p/librte_ip_frag.so.24.0.symbols 00:02:29.974 [311/705] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:29.974 [312/705] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:30.235 [313/705] Compiling C object lib/librte_member.a.p/member_rte_member_sketch_avx512.c.o 00:02:30.235 [314/705] Compiling C object lib/librte_member.a.p/member_rte_member_ht.c.o 00:02:30.235 [315/705] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:30.235 [316/705] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:30.235 [317/705] Compiling C object lib/librte_lpm.a.p/lpm_rte_lpm6.c.o 00:02:30.235 [318/705] Linking static target lib/librte_lpm.a 00:02:30.235 [319/705] Compiling C object 
lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:30.235 [320/705] Compiling C object lib/librte_member.a.p/member_rte_member_vbf.c.o 00:02:30.497 [321/705] Generating lib/lpm.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.497 [322/705] Linking target lib/librte_lpm.so.24.0 00:02:30.497 [323/705] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:30.497 [324/705] Compiling C object lib/librte_pcapng.a.p/pcapng_rte_pcapng.c.o 00:02:30.497 [325/705] Linking static target lib/librte_pcapng.a 00:02:30.497 [326/705] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:30.497 [327/705] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:30.497 [328/705] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:30.497 [329/705] Generating symbol file lib/librte_lpm.so.24.0.p/librte_lpm.so.24.0.symbols 00:02:30.757 [330/705] Generating lib/eventdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.757 [331/705] Generating lib/pcapng.sym_chk with a custom command (wrapped by meson to capture output) 00:02:30.757 [332/705] Linking target lib/librte_eventdev.so.24.0 00:02:30.757 [333/705] Linking target lib/librte_pcapng.so.24.0 00:02:30.757 [334/705] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:30.757 [335/705] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils.c.o 00:02:30.757 [336/705] Generating symbol file lib/librte_eventdev.so.24.0.p/librte_eventdev.so.24.0.symbols 00:02:30.757 [337/705] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:30.757 [338/705] Generating symbol file lib/librte_pcapng.so.24.0.p/librte_pcapng.so.24.0.symbols 00:02:30.757 [339/705] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev_pmd.c.o 00:02:30.757 [340/705] Linking static target lib/librte_power.a 00:02:30.757 [341/705] Linking target lib/librte_dispatcher.so.24.0 00:02:31.019 [342/705] Compiling C object lib/librte_rawdev.a.p/rawdev_rte_rawdev.c.o 00:02:31.019 [343/705] Linking static target lib/librte_rawdev.a 00:02:31.019 [344/705] Compiling C object lib/librte_mldev.a.p/mldev_rte_mldev.c.o 00:02:31.019 [345/705] Compiling C object lib/librte_regexdev.a.p/regexdev_rte_regexdev.c.o 00:02:31.019 [346/705] Linking static target lib/librte_regexdev.a 00:02:31.019 [347/705] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar_bfloat16.c.o 00:02:31.281 [348/705] Generating lib/rawdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.281 [349/705] Compiling C object lib/librte_mldev.a.p/mldev_mldev_utils_scalar.c.o 00:02:31.281 [350/705] Linking static target lib/librte_mldev.a 00:02:31.281 [351/705] Linking target lib/librte_rawdev.so.24.0 00:02:31.281 [352/705] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.281 [353/705] Linking target lib/librte_power.so.24.0 00:02:31.281 [354/705] Compiling C object lib/librte_sched.a.p/sched_rte_red.c.o 00:02:31.281 [355/705] Compiling C object lib/librte_rib.a.p/rib_rte_rib.c.o 00:02:31.281 [356/705] Compiling C object lib/librte_member.a.p/member_rte_member_sketch.c.o 00:02:31.281 [357/705] Linking static target lib/librte_member.a 00:02:31.281 [358/705] Compiling C object lib/librte_sched.a.p/sched_rte_approx.c.o 00:02:31.543 [359/705] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:31.543 [360/705] Linking static target lib/librte_reorder.a 00:02:31.543 
[361/705] Generating lib/regexdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.543 [362/705] Linking target lib/librte_regexdev.so.24.0 00:02:31.543 [363/705] Compiling C object lib/librte_sched.a.p/sched_rte_pie.c.o 00:02:31.543 [364/705] Generating lib/member.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.543 [365/705] Linking target lib/librte_member.so.24.0 00:02:31.543 [366/705] Compiling C object lib/librte_stack.a.p/stack_rte_stack.c.o 00:02:31.543 [367/705] Compiling C object lib/librte_stack.a.p/stack_rte_stack_std.c.o 00:02:31.543 [368/705] Compiling C object lib/librte_rib.a.p/rib_rte_rib6.c.o 00:02:31.543 [369/705] Linking static target lib/librte_rib.a 00:02:31.543 [370/705] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.805 [371/705] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:31.805 [372/705] Linking target lib/librte_reorder.so.24.0 00:02:31.805 [373/705] Compiling C object lib/librte_stack.a.p/stack_rte_stack_lf.c.o 00:02:31.805 [374/705] Linking static target lib/librte_stack.a 00:02:31.805 [375/705] Generating symbol file lib/librte_reorder.so.24.0.p/librte_reorder.so.24.0.symbols 00:02:31.805 [376/705] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:31.805 [377/705] Linking static target lib/librte_security.a 00:02:31.805 [378/705] Generating lib/rib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.805 [379/705] Generating lib/stack.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.066 [380/705] Linking target lib/librte_rib.so.24.0 00:02:32.066 [381/705] Linking target lib/librte_stack.so.24.0 00:02:32.066 [382/705] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:32.066 [383/705] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:32.066 [384/705] Generating symbol file lib/librte_rib.so.24.0.p/librte_rib.so.24.0.symbols 00:02:32.066 [385/705] Generating lib/mldev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.066 [386/705] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:32.066 [387/705] Linking target lib/librte_mldev.so.24.0 00:02:32.066 [388/705] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.066 [389/705] Linking target lib/librte_security.so.24.0 00:02:32.327 [390/705] Generating symbol file lib/librte_security.so.24.0.p/librte_security.so.24.0.symbols 00:02:32.327 [391/705] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:32.327 [392/705] Compiling C object lib/librte_sched.a.p/sched_rte_sched.c.o 00:02:32.327 [393/705] Linking static target lib/librte_sched.a 00:02:32.327 [394/705] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:32.588 [395/705] Compiling C object lib/librte_ipsec.a.p/ipsec_sa.c.o 00:02:32.588 [396/705] Compiling C object lib/librte_ipsec.a.p/ipsec_ses.c.o 00:02:32.910 [397/705] Generating lib/sched.sym_chk with a custom command (wrapped by meson to capture output) 00:02:32.910 [398/705] Linking target lib/librte_sched.so.24.0 00:02:32.910 [399/705] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:32.910 [400/705] Generating symbol file lib/librte_sched.so.24.0.p/librte_sched.so.24.0.symbols 00:02:32.910 [401/705] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:32.910 [402/705] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_telemetry.c.o 
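Note: the "User defined options" summary and the `meson [options]` deprecation warning logged earlier come from SPDK's common/autobuild_common.sh driving DPDK's configure step. A minimal sketch of an equivalent standalone invocation, reconstructed from the logged options rather than copied from that script (the explicit `setup` subcommand is the spelling the warning asks for; paths are taken from this job's workspace):

    # reconstructed configure + build, assuming the DPDK checkout at the logged path
    cd /home/vagrant/spdk_repo/dpdk
    meson setup build-tmp \
        --prefix=/home/vagrant/spdk_repo/dpdk/build \
        --libdir=lib \
        -Dc_args='-fPIC -g -fcommon -Werror -Wno-stringop-overflow' \
        -Denable_docs=false \
        -Denable_kmods=false \
        -Dmachine=native \
        -Dtests=false \
        -Denable_drivers=bus,bus/pci,bus/vdev,mempool/ring,net/i40e,net/i40e/base,power/acpi,power/amd_pstate,power/cppc,power/intel_pstate,power/intel_uncore,power/kvm_vm
    ninja -C build-tmp -j10

The enable_drivers value mirrors the "Drivers Enabled" summary: only bus/pci, bus/vdev, mempool/ring, and net/i40e end up in the build, which is why the "Content Skipped" section lists every other driver.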
00:02:32.910 [403/705] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_cnt.c.o 00:02:33.171 [404/705] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_crypto.c.o 00:02:33.171 [405/705] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_ctrl_pdu.c.o 00:02:33.171 [406/705] Compiling C object lib/librte_ipsec.a.p/ipsec_ipsec_sad.c.o 00:02:33.171 [407/705] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_reorder.c.o 00:02:33.171 [408/705] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_outb.c.o 00:02:33.171 [409/705] Compiling C object lib/librte_ipsec.a.p/ipsec_esp_inb.c.o 00:02:33.171 [410/705] Linking static target lib/librte_ipsec.a 00:02:33.434 [411/705] Compiling C object lib/librte_fib.a.p/fib_rte_fib.c.o 00:02:33.434 [412/705] Compiling C object lib/librte_fib.a.p/fib_rte_fib6.c.o 00:02:33.434 [413/705] Compiling C object lib/librte_pdcp.a.p/pdcp_rte_pdcp.c.o 00:02:33.434 [414/705] Generating lib/ipsec.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.434 [415/705] Linking target lib/librte_ipsec.so.24.0 00:02:33.434 [416/705] Compiling C object lib/librte_fib.a.p/fib_dir24_8_avx512.c.o 00:02:33.696 [417/705] Generating symbol file lib/librte_ipsec.so.24.0.p/librte_ipsec.so.24.0.symbols 00:02:33.696 [418/705] Compiling C object lib/librte_fib.a.p/fib_trie_avx512.c.o 00:02:33.696 [419/705] Compiling C object lib/librte_fib.a.p/fib_trie.c.o 00:02:33.958 [420/705] Compiling C object lib/librte_fib.a.p/fib_dir24_8.c.o 00:02:33.958 [421/705] Linking static target lib/librte_fib.a 00:02:33.958 [422/705] Compiling C object lib/librte_port.a.p/port_rte_port_ethdev.c.o 00:02:33.958 [423/705] Compiling C object lib/librte_port.a.p/port_rte_port_fd.c.o 00:02:33.958 [424/705] Compiling C object lib/librte_port.a.p/port_rte_port_frag.c.o 00:02:33.958 [425/705] Compiling C object lib/librte_port.a.p/port_rte_port_sched.c.o 00:02:33.958 [426/705] Compiling C object lib/librte_port.a.p/port_rte_port_ras.c.o 00:02:33.958 [427/705] Generating lib/fib.sym_chk with a custom command (wrapped by meson to capture output) 00:02:33.958 [428/705] Linking target lib/librte_fib.so.24.0 00:02:34.219 [429/705] Compiling C object lib/librte_pdcp.a.p/pdcp_pdcp_process.c.o 00:02:34.219 [430/705] Linking static target lib/librte_pdcp.a 00:02:34.219 [431/705] Generating lib/pdcp.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.219 [432/705] Linking target lib/librte_pdcp.so.24.0 00:02:34.481 [433/705] Compiling C object lib/librte_port.a.p/port_rte_port_sym_crypto.c.o 00:02:34.481 [434/705] Compiling C object lib/librte_port.a.p/port_rte_port_source_sink.c.o 00:02:34.481 [435/705] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ethdev.c.o 00:02:34.481 [436/705] Compiling C object lib/librte_port.a.p/port_rte_swx_port_fd.c.o 00:02:34.481 [437/705] Compiling C object lib/librte_port.a.p/port_rte_port_eventdev.c.o 00:02:34.481 [438/705] Compiling C object lib/librte_table.a.p/table_rte_swx_keycmp.c.o 00:02:34.743 [439/705] Compiling C object lib/librte_port.a.p/port_rte_swx_port_ring.c.o 00:02:34.743 [440/705] Compiling C object lib/librte_port.a.p/port_rte_swx_port_source_sink.c.o 00:02:34.743 [441/705] Compiling C object lib/librte_table.a.p/table_rte_swx_table_learner.c.o 00:02:34.743 [442/705] Compiling C object lib/librte_table.a.p/table_rte_swx_table_selector.c.o 00:02:35.005 [443/705] Compiling C object lib/librte_table.a.p/table_rte_swx_table_wm.c.o 00:02:35.005 [444/705] Compiling C object lib/librte_table.a.p/table_rte_swx_table_em.c.o 00:02:35.005 
[445/705] Compiling C object lib/librte_port.a.p/port_rte_port_ring.c.o
00:02:35.005 [446/705] Linking static target lib/librte_port.a
00:02:35.005 [447/705] Compiling C object lib/librte_table.a.p/table_rte_table_array.c.o
00:02:35.005 [448/705] Compiling C object lib/librte_pdump.a.p/pdump_rte_pdump.c.o
00:02:35.005 [449/705] Linking static target lib/librte_pdump.a
00:02:35.005 [450/705] Compiling C object lib/librte_table.a.p/table_rte_table_acl.c.o
00:02:35.264 [451/705] Compiling C object lib/librte_table.a.p/table_rte_table_hash_cuckoo.c.o
00:02:35.264 [452/705] Generating lib/pdump.sym_chk with a custom command (wrapped by meson to capture output)
00:02:35.264 [453/705] Generating lib/port.sym_chk with a custom command (wrapped by meson to capture output)
00:02:35.264 [454/705] Linking target lib/librte_pdump.so.24.0
00:02:35.264 [455/705] Linking target lib/librte_port.so.24.0
00:02:35.524 [456/705] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o
00:02:35.524 [457/705] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key8.c.o
00:02:35.524 [458/705] Compiling C object lib/librte_table.a.p/table_rte_table_lpm.c.o
00:02:35.524 [459/705] Generating symbol file lib/librte_port.so.24.0.p/librte_port.so.24.0.symbols
00:02:35.524 [460/705] Compiling C object lib/librte_table.a.p/table_rte_table_hash_ext.c.o
00:02:35.524 [461/705] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key16.c.o
00:02:35.524 [462/705] Compiling C object lib/librte_table.a.p/table_rte_table_lpm_ipv6.c.o
00:02:35.785 [463/705] Compiling C object lib/librte_table.a.p/table_rte_table_stub.c.o
00:02:35.785 [464/705] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_port_in_action.c.o
00:02:35.786 [465/705] Compiling C object lib/librte_table.a.p/table_rte_table_hash_lru.c.o
00:02:35.786 [466/705] Compiling C object lib/librte_table.a.p/table_rte_table_hash_key32.c.o
00:02:35.786 [467/705] Linking static target lib/librte_table.a
00:02:36.046 [468/705] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_pipeline.c.o
00:02:36.308 [469/705] Compiling C object lib/librte_graph.a.p/graph_node.c.o
00:02:36.308 [470/705] Generating lib/table.sym_chk with a custom command (wrapped by meson to capture output)
00:02:36.308 [471/705] Compiling C object lib/librte_graph.a.p/graph_graph.c.o
00:02:36.308 [472/705] Linking target lib/librte_table.so.24.0
00:02:36.308 [473/705] Compiling C object lib/librte_graph.a.p/graph_graph_ops.c.o
00:02:36.308 [474/705] Generating symbol file lib/librte_table.so.24.0.p/librte_table.so.24.0.symbols
00:02:36.308 [475/705] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ipsec.c.o
00:02:36.308 [476/705] Compiling C object lib/librte_graph.a.p/graph_graph_debug.c.o
00:02:36.569 [477/705] Compiling C object lib/librte_graph.a.p/graph_graph_populate.c.o
00:02:36.569 [478/705] Compiling C object lib/librte_graph.a.p/graph_rte_graph_worker.c.o
00:02:36.569 [479/705] Compiling C object lib/librte_graph.a.p/graph_graph_stats.c.o
00:02:36.569 [480/705] Compiling C object lib/librte_graph.a.p/graph_graph_pcap.c.o
00:02:36.832 [481/705] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_ctl.c.o
00:02:36.832 [482/705] Compiling C object lib/librte_node.a.p/node_ethdev_ctrl.c.o
00:02:36.832 [483/705] Compiling C object lib/librte_graph.a.p/graph_rte_graph_model_mcore_dispatch.c.o
00:02:36.832 [484/705] Linking static target lib/librte_graph.a
00:02:37.093 [485/705] Compiling C object lib/librte_node.a.p/node_ethdev_tx.c.o
00:02:37.093 [486/705] Compiling C object lib/librte_node.a.p/node_ip4_local.c.o
00:02:37.093 [487/705] Compiling C object lib/librte_node.a.p/node_ethdev_rx.c.o
00:02:37.093 [488/705] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline_spec.c.o
00:02:37.353 [489/705] Compiling C object lib/librte_node.a.p/node_ip4_reassembly.c.o
00:02:37.353 [490/705] Generating lib/graph.sym_chk with a custom command (wrapped by meson to capture output)
00:02:37.353 [491/705] Linking target lib/librte_graph.so.24.0
00:02:37.353 [492/705] Compiling C object lib/librte_node.a.p/node_ip4_lookup.c.o
00:02:37.614 [493/705] Generating symbol file lib/librte_graph.so.24.0.p/librte_graph.so.24.0.symbols
00:02:37.614 [494/705] Compiling C object lib/librte_node.a.p/node_null.c.o
00:02:37.614 [495/705] Compiling C object lib/librte_node.a.p/node_kernel_tx.c.o
00:02:37.614 [496/705] Compiling C object lib/librte_node.a.p/node_ip6_lookup.c.o
00:02:37.614 [497/705] Compiling C object lib/librte_node.a.p/node_ip4_rewrite.c.o
00:02:37.614 [498/705] Compiling C object lib/librte_node.a.p/node_log.c.o
00:02:37.934 [499/705] Compiling C object lib/librte_node.a.p/node_kernel_rx.c.o
00:02:37.934 [500/705] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o
00:02:37.934 [501/705] Compiling C object lib/librte_node.a.p/node_pkt_drop.c.o
00:02:37.934 [502/705] Compiling C object lib/librte_node.a.p/node_ip6_rewrite.c.o
00:02:37.934 [503/705] Compiling C object lib/librte_node.a.p/node_pkt_cls.c.o
00:02:37.934 [504/705] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o
00:02:38.196 [505/705] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o
00:02:38.196 [506/705] Compiling C object lib/librte_node.a.p/node_udp4_input.c.o
00:02:38.196 [507/705] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o
00:02:38.196 [508/705] Linking static target lib/librte_node.a
00:02:38.196 [509/705] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o
00:02:38.196 [510/705] Generating lib/node.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.196 [511/705] Linking target lib/librte_node.so.24.0
00:02:38.457 [512/705] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o
00:02:38.457 [513/705] Linking static target drivers/libtmp_rte_bus_vdev.a
00:02:38.457 [514/705] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o
00:02:38.457 [515/705] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o
00:02:38.457 [516/705] Linking static target drivers/libtmp_rte_bus_pci.a
00:02:38.457 [517/705] Generating drivers/rte_bus_vdev.pmd.c with a custom command
00:02:38.457 [518/705] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o
00:02:38.457 [519/705] Linking static target drivers/librte_bus_vdev.a
00:02:38.457 [520/705] Generating drivers/rte_bus_pci.pmd.c with a custom command
00:02:38.457 [521/705] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o
00:02:38.457 [522/705] Linking static target drivers/librte_bus_pci.a
00:02:38.719 [523/705] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_adminq.c.o
00:02:38.719 [524/705] Compiling C object drivers/librte_bus_pci.so.24.0.p/meson-generated_.._rte_bus_pci.pmd.c.o
00:02:38.719 [525/705] Compiling C object drivers/librte_bus_vdev.so.24.0.p/meson-generated_.._rte_bus_vdev.pmd.c.o
00:02:38.719 [526/705] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.719 [527/705] Linking target drivers/librte_bus_vdev.so.24.0
00:02:38.719 [528/705] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_diag.c.o
00:02:38.719 [529/705] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_dcb.c.o
00:02:38.719 [530/705] Generating symbol file drivers/librte_bus_vdev.so.24.0.p/librte_bus_vdev.so.24.0.symbols
00:02:38.719 [531/705] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o
00:02:38.719 [532/705] Linking static target drivers/libtmp_rte_mempool_ring.a
00:02:38.980 [533/705] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.980 [534/705] Linking target drivers/librte_bus_pci.so.24.0
00:02:38.980 [535/705] Generating drivers/rte_mempool_ring.pmd.c with a custom command
00:02:38.980 [536/705] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_hmc.c.o
00:02:38.980 [537/705] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o
00:02:38.980 [538/705] Linking static target drivers/librte_mempool_ring.a
00:02:38.980 [539/705] Compiling C object drivers/librte_mempool_ring.so.24.0.p/meson-generated_.._rte_mempool_ring.pmd.c.o
00:02:38.980 [540/705] Linking target drivers/librte_mempool_ring.so.24.0
00:02:38.980 [541/705] Generating symbol file drivers/librte_bus_pci.so.24.0.p/librte_bus_pci.so.24.0.symbols
00:02:39.241 [542/705] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_lan_hmc.c.o
00:02:39.241 [543/705] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_nvm.c.o
00:02:39.501 [544/705] Compiling C object drivers/net/i40e/base/libi40e_base.a.p/i40e_common.c.o
00:02:39.501 [545/705] Linking static target drivers/net/i40e/base/libi40e_base.a
00:02:40.071 [546/705] Compiling C object drivers/net/i40e/libi40e_avx512_lib.a.p/i40e_rxtx_vec_avx512.c.o
00:02:40.071 [547/705] Linking static target drivers/net/i40e/libi40e_avx512_lib.a
00:02:40.072 [548/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_pf.c.o
00:02:40.072 [549/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_fdir.c.o
00:02:40.072 [550/705] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_swx_pipeline.c.o
00:02:40.333 [551/705] Compiling C object drivers/net/i40e/libi40e_avx2_lib.a.p/i40e_rxtx_vec_avx2.c.o
00:02:40.334 [552/705] Linking static target drivers/net/i40e/libi40e_avx2_lib.a
00:02:40.334 [553/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_flow.c.o
00:02:40.596 [554/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_vf_representor.c.o
00:02:40.596 [555/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_recycle_mbufs_vec_common.c.o
00:02:40.596 [556/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_tm.c.o
00:02:40.859 [557/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_hash.c.o
00:02:40.859 [558/705] Compiling C object app/dpdk-graph.p/graph_cli.c.o
00:02:40.859 [559/705] Compiling C object app/dpdk-graph.p/graph_conn.c.o
00:02:41.121 [560/705] Compiling C object app/dpdk-dumpcap.p/dumpcap_main.c.o
00:02:41.121 [561/705] Compiling C object app/dpdk-graph.p/graph_ethdev_rx.c.o
00:02:41.121 [562/705] Compiling C object app/dpdk-graph.p/graph_ethdev.c.o
00:02:41.382 [563/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx.c.o
00:02:41.382 [564/705] Compiling C object app/dpdk-graph.p/graph_ip4_route.c.o
00:02:41.382 [565/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_rxtx_vec_sse.c.o
00:02:41.382 [566/705] Compiling C object app/dpdk-graph.p/graph_graph.c.o
00:02:41.382 [567/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_rte_pmd_i40e.c.o
00:02:41.644 [568/705] Compiling C object app/dpdk-graph.p/graph_main.c.o
00:02:41.644 [569/705] Compiling C object app/dpdk-graph.p/graph_mempool.c.o
00:02:41.644 [570/705] Compiling C object app/dpdk-graph.p/graph_l3fwd.c.o
00:02:41.644 [571/705] Compiling C object app/dpdk-graph.p/graph_ip6_route.c.o
00:02:41.644 [572/705] Compiling C object app/dpdk-graph.p/graph_neigh.c.o
00:02:41.906 [573/705] Compiling C object app/dpdk-graph.p/graph_utils.c.o
00:02:41.906 [574/705] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_main.c.o
00:02:41.906 [575/705] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_commands.c.o
00:02:41.906 [576/705] Compiling C object app/dpdk-test-cmdline.p/test-cmdline_cmdline_test.c.o
00:02:41.906 [577/705] Compiling C object drivers/libtmp_rte_net_i40e.a.p/net_i40e_i40e_ethdev.c.o
00:02:41.906 [578/705] Linking static target drivers/libtmp_rte_net_i40e.a
00:02:42.167 [579/705] Compiling C object app/dpdk-test-acl.p/test-acl_main.c.o
00:02:42.167 [580/705] Generating drivers/rte_net_i40e.pmd.c with a custom command
00:02:42.167 [581/705] Compiling C object app/dpdk-pdump.p/pdump_main.c.o
00:02:42.167 [582/705] Compiling C object drivers/librte_net_i40e.a.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:02:42.167 [583/705] Compiling C object drivers/librte_net_i40e.so.24.0.p/meson-generated_.._rte_net_i40e.pmd.c.o
00:02:42.167 [584/705] Linking static target drivers/librte_net_i40e.a
00:02:42.167 [585/705] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_options_parse.c.o
00:02:42.429 [586/705] Compiling C object app/dpdk-proc-info.p/proc-info_main.c.o
00:02:42.429 [587/705] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_vector.c.o
00:02:42.429 [588/705] Generating drivers/rte_net_i40e.sym_chk with a custom command (wrapped by meson to capture output)
00:02:42.735 [589/705] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev.c.o
00:02:42.735 [590/705] Linking target drivers/librte_net_i40e.so.24.0
00:02:42.735 [591/705] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_throughput.c.o
00:02:42.735 [592/705] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_common.c.o
00:02:42.735 [593/705] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_verify.c.o
00:02:42.995 [594/705] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_main.c.o
00:02:42.995 [595/705] Compiling C object app/dpdk-test-compress-perf.p/test-compress-perf_comp_perf_test_cyclecount.c.o
00:02:42.995 [596/705] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_common.c.o
00:02:42.995 [597/705] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_ops.c.o
00:02:42.995 [598/705] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_options_parsing.c.o
00:02:43.256 [599/705] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vector_parsing.c.o
00:02:43.256 [600/705] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_vectors.c.o
00:02:43.256 [601/705] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_latency.c.o
00:02:43.256 [602/705] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_throughput.c.o
00:02:43.518 [603/705] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_pmd_cyclecount.c.o
00:02:43.518 [604/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_test.c.o
00:02:43.518 [605/705] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_main.c.o
00:02:43.518 [606/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_parser.c.o
00:02:43.518 [607/705] Compiling C object app/dpdk-test-crypto-perf.p/test-crypto-perf_cperf_test_verify.c.o
00:02:43.518 [608/705] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_main.c.o
00:02:43.777 [609/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_main.c.o
00:02:43.777 [610/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_evt_options.c.o
00:02:43.777 [611/705] Compiling C object app/dpdk-test-dma-perf.p/test-dma-perf_benchmark.c.o
00:02:44.036 [612/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_common.c.o
00:02:44.036 [613/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_atq.c.o
00:02:44.036 [614/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_order_queue.c.o
00:02:44.604 [615/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_atq.c.o
00:02:44.604 [616/705] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o
00:02:44.605 [617/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_atq.c.o
00:02:44.605 [618/705] Linking static target lib/librte_vhost.a
00:02:44.605 [619/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_common.c.o
00:02:44.605 [620/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_queue.c.o
00:02:44.605 [621/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_pipeline_queue.c.o
00:02:44.605 [622/705] Compiling C object app/dpdk-test-fib.p/test-fib_main.c.o
00:02:44.863 [623/705] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_flow_gen.c.o
00:02:44.863 [624/705] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_items_gen.c.o
00:02:44.863 [625/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_test.c.o
00:02:44.863 [626/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_main.c.o
00:02:44.863 [627/705] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_actions_gen.c.o
00:02:45.121 [628/705] Compiling C object app/dpdk-test-gpudev.p/test-gpudev_main.c.o
00:02:45.121 [629/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_parser.c.o
00:02:45.121 [630/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_ml_options.c.o
00:02:45.121 [631/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_device_ops.c.o
00:02:45.378 [632/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_common.c.o
00:02:45.378 [633/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_common.c.o
00:02:45.378 [634/705] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output)
00:02:45.378 [635/705] Linking target lib/librte_vhost.so.24.0
00:02:45.378 [636/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_model_ops.c.o
00:02:45.378 [637/705] Compiling C object app/dpdk-test-flow-perf.p/test-flow-perf_main.c.o
00:02:45.378 [638/705] Compiling C object lib/librte_pipeline.a.p/pipeline_rte_table_action.c.o
00:02:45.636 [639/705] Linking static target lib/librte_pipeline.a
00:02:45.636 [640/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_interleave.c.o
00:02:45.636 [641/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_stats.c.o
00:02:45.636 [642/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_ordered.c.o
00:02:45.636 [643/705] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_config.c.o
00:02:45.636 [644/705] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_init.c.o
00:02:45.898 [645/705] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_main.c.o
00:02:45.898 [646/705] Linking target app/dpdk-dumpcap
00:02:45.898 [647/705] Linking target app/dpdk-graph
00:02:45.898 [648/705] Linking target app/dpdk-proc-info
00:02:45.898 [649/705] Linking target app/dpdk-pdump
00:02:45.898 [650/705] Linking target app/dpdk-test-acl
00:02:46.156 [651/705] Linking target app/dpdk-test-cmdline
00:02:46.156 [652/705] Linking target app/dpdk-test-compress-perf
00:02:46.156 [653/705] Linking target app/dpdk-test-crypto-perf
00:02:46.156 [654/705] Linking target app/dpdk-test-fib
00:02:46.156 [655/705] Linking target app/dpdk-test-dma-perf
00:02:46.156 [656/705] Linking target app/dpdk-test-flow-perf
00:02:46.413 [657/705] Linking target app/dpdk-test-gpudev
00:02:46.413 [658/705] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_acl.c.o
00:02:46.413 [659/705] Compiling C object app/dpdk-test-eventdev.p/test-eventdev_test_perf_common.c.o
00:02:46.413 [660/705] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_stub.c.o
00:02:46.413 [661/705] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm.c.o
00:02:46.413 [662/705] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_lpm_ipv6.c.o
00:02:46.413 [663/705] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_pipeline_hash.c.o
00:02:46.670 [664/705] Compiling C object app/dpdk-test-mldev.p/test-mldev_test_inference_common.c.o
00:02:46.670 [665/705] Compiling C object app/dpdk-test-bbdev.p/test-bbdev_test_bbdev_perf.c.o
00:02:46.670 [666/705] Compiling C object app/dpdk-testpmd.p/test-pmd_5tswap.c.o
00:02:46.929 [667/705] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_cman.c.o
00:02:46.929 [668/705] Linking target app/dpdk-test-eventdev
00:02:46.929 [669/705] Compiling C object app/dpdk-test-pipeline.p/test-pipeline_runtime.c.o
00:02:46.929 [670/705] Linking target app/dpdk-test-mldev
00:02:46.929 [671/705] Linking target app/dpdk-test-bbdev
00:02:46.929 [672/705] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_mtr.c.o
00:02:47.187 [673/705] Compiling C object app/dpdk-testpmd.p/test-pmd_cmd_flex_item.c.o
00:02:47.187 [674/705] Linking target app/dpdk-test-pipeline
00:02:47.187 [675/705] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_tm.c.o
00:02:47.445 [676/705] Compiling C object app/dpdk-testpmd.p/test-pmd_flowgen.c.o
00:02:47.445 [677/705] Generating lib/pipeline.sym_chk with a custom command (wrapped by meson to capture output)
00:02:47.704 [678/705] Compiling C object app/dpdk-testpmd.p/test-pmd_iofwd.c.o
00:02:47.704 [679/705] Linking target lib/librte_pipeline.so.24.0
00:02:47.704 [680/705] Compiling C object app/dpdk-testpmd.p/test-pmd_ieee1588fwd.c.o
00:02:47.704 [681/705] Compiling C object app/dpdk-testpmd.p/test-pmd_macfwd.c.o
00:02:47.704 [682/705] Compiling C object app/dpdk-testpmd.p/test-pmd_icmpecho.c.o
00:02:47.704 [683/705] Compiling C object app/dpdk-testpmd.p/test-pmd_macswap.c.o
00:02:47.962 [684/705] Compiling C object app/dpdk-testpmd.p/test-pmd_rxonly.c.o
00:02:47.962 [685/705] Compiling C object app/dpdk-testpmd.p/test-pmd_csumonly.c.o
00:02:47.962 [686/705] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline.c.o
00:02:47.962 [687/705] Compiling C object app/dpdk-testpmd.p/test-pmd_recycle_mbufs.c.o
00:02:47.962 [688/705] Compiling C object app/dpdk-testpmd.p/test-pmd_shared_rxq_fwd.c.o
00:02:48.220 [689/705] Compiling C object app/dpdk-testpmd.p/test-pmd_parameters.c.o
00:02:48.220 [690/705] Compiling C object app/dpdk-testpmd.p/test-pmd_bpf_cmd.c.o
00:02:48.479 [691/705] Compiling C object app/dpdk-testpmd.p/test-pmd_util.c.o
00:02:48.479 [692/705] Compiling C object app/dpdk-test-sad.p/test-sad_main.c.o
00:02:48.740 [693/705] Compiling C object app/dpdk-test-security-perf.p/test-security-perf_test_security_perf.c.o
00:02:48.740 [694/705] Compiling C object app/dpdk-testpmd.p/.._drivers_net_i40e_i40e_testpmd.c.o
00:02:48.740 [695/705] Compiling C object app/dpdk-test-regex.p/test-regex_main.c.o
00:02:48.740 [696/705] Compiling C object app/dpdk-testpmd.p/test-pmd_txonly.c.o
00:02:48.999 [697/705] Linking target app/dpdk-test-sad
00:02:48.999 [698/705] Compiling C object app/dpdk-testpmd.p/test-pmd_config.c.o
00:02:48.999 [699/705] Compiling C object app/dpdk-testpmd.p/test-pmd_testpmd.c.o
00:02:48.999 [700/705] Compiling C object app/dpdk-testpmd.p/test-pmd_cmdline_flow.c.o
00:02:49.260 [701/705] Linking target app/dpdk-test-regex
00:02:49.260 [702/705] Compiling C object app/dpdk-test-security-perf.p/test_test_cryptodev_security_ipsec.c.o
00:02:49.260 [703/705] Compiling C object app/dpdk-testpmd.p/test-pmd_noisy_vnf.c.o
00:02:49.520 [704/705] Linking target app/dpdk-test-security-perf
00:02:49.520 [705/705] Linking target app/dpdk-testpmd
00:02:49.520 04:53:09 build_native_dpdk -- common/autobuild_common.sh@201 -- $ uname -s
00:02:49.520 04:53:09 build_native_dpdk -- common/autobuild_common.sh@201 -- $ [[ Linux == \F\r\e\e\B\S\D ]]
00:02:49.521 04:53:09 build_native_dpdk -- common/autobuild_common.sh@214 -- $ ninja -C /home/vagrant/spdk_repo/dpdk/build-tmp -j10 install
00:02:49.521 ninja: Entering directory `/home/vagrant/spdk_repo/dpdk/build-tmp'
00:02:49.783 [0/1] Installing files.
00:02:49.783 Installing subdir /home/vagrant/spdk_repo/dpdk/examples to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/bbdev_app/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bbdev_app
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/bbdev_app/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bbdev_app
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/bond/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bond
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/bond/commands.list to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bond
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/bond/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bond
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/bpf/README to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bpf
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/bpf/dummy.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bpf
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/bpf/t1.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bpf
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/bpf/t2.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bpf
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/bpf/t3.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/bpf
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/commands.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/commands.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/parse_obj_list.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/cmdline/parse_obj_list.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/cmdline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/common/pkt_group.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/common
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/common/altivec/port_group.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/common/altivec
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/common/neon/port_group.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/common/neon
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/common/sse/port_group.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/common/sse
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/distributor/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/distributor
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/distributor/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/distributor
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/dma/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/dma
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/dma/dmafwd.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/dma
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/ethtool-app/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/ethtool-app/ethapp.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/ethtool-app/ethapp.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/ethtool-app/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/ethtool-app
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/lib/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/lib/rte_ethtool.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ethtool/lib/rte_ethtool.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ethtool/lib
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/eventdev_pipeline/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/eventdev_pipeline/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/eventdev_pipeline/pipeline_common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/eventdev_pipeline/pipeline_worker_generic.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/eventdev_pipeline/pipeline_worker_tx.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/eventdev_pipeline
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_dev_self_test.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_dev_self_test.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_aes.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_ccm.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_cmac.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_ecdsa.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_gcm.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_hmac.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_rsa.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_sha.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_tdes.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/fips_validation_xts.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/fips_validation/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/fips_validation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/flow_filtering/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/flow_filtering
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/flow_filtering/flow_blocks.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/flow_filtering
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/flow_filtering/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/flow_filtering
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/helloworld/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/helloworld
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/helloworld/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/helloworld
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_fragmentation/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_fragmentation/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_fragmentation
00:02:49.783 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/action.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/action.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/cli.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/cli.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/conn.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/conn.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/cryptodev.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/cryptodev.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/link.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/link.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/mempool.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/mempool.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/parser.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/parser.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/pipeline.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/pipeline.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/swq.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/swq.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/tap.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/tap.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/thread.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/thread.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/tmgr.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/tmgr.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/firewall.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/flow.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/flow_crypto.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/l2fwd.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/route.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/route_ecmp.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/rss.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_pipeline/examples/tap.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_pipeline/examples
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_reassembly/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ip_reassembly/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ip_reassembly
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ep0.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ep1.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/esp.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/esp.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/event_helper.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/event_helper.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/flow.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/flow.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipip.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec-secgw.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec-secgw.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec_lpm_neon.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec_neon.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec_process.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec_worker.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/ipsec_worker.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/parser.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/parser.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/rt.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/sa.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/sad.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/sad.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/sp4.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/sp6.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/bypass_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/common_defs_secgw.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/data_rxtx.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/linux_test.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/load_env.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/pkttest.py to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/pkttest.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/run_test.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_3descbc_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.784 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aescbc_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aesctr_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aesgcm_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_aesgcm_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/trs_ipv6opts.py to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_3descbc_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aescbc_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aesctr_sha1_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aesgcm_common_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_aesgcm_defs.sh to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipsec-secgw/test/tun_null_header_reconstruct.py to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipsec-secgw/test
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipv4_multicast/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipv4_multicast
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/ipv4_multicast/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ipv4_multicast
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-cat/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-cat/cat.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-cat/cat.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-cat/l2fwd-cat.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-cat
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-crypto/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-crypto
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-crypto/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-crypto
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_common.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_event.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_event.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:49.785 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_event_generic.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_event_internal_port.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_poll.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/l2fwd_poll.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-event/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-event
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-jobstats/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-jobstats/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-jobstats
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/shm.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/shm.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/ka-agent/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-keepalive/ka-agent/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-keepalive/ka-agent
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-macsec/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-macsec
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd-macsec/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd-macsec
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l2fwd/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l2fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-graph/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-graph
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-graph/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-graph
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-power/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-power/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-power/main.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-power/perf_core.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd-power/perf_core.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd-power
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/em_default_v4.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/em_default_v6.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/em_route_parse.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_acl.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_acl.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_acl_scalar.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_altivec.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em_hlm.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em_hlm_neon.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em_hlm_sse.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_em_sequential.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_event.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_event.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_event_generic.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_event_internal_port.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_fib.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_lpm.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_lpm.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_lpm_altivec.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_lpm_neon.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.047 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_lpm_sse.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_neon.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_route.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/l3fwd_sse.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/lpm_default_v4.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/lpm_default_v6.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/lpm_route_parse.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/l3fwd/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/l3fwd
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/link_status_interrupt/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/link_status_interrupt/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/link_status_interrupt
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_client/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_client/client.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_client
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/args.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/args.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/init.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/init.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/mp_server/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/mp_server
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/client_server_mp/shared/common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/client_server_mp/shared
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/hotplug_mp/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/hotplug_mp/commands.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/hotplug_mp/commands.list to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/hotplug_mp/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/hotplug_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/simple_mp/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/simple_mp/commands.list to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/simple_mp/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/simple_mp/mp_commands.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/simple_mp/mp_commands.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/simple_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/symmetric_mp/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/multi_process/symmetric_mp/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/multi_process/symmetric_mp
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/ntb/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ntb
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/ntb/commands.list to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ntb
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/ntb/ntb_fwd.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ntb
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/packet_ordering/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/packet_ordering
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/packet_ordering/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/packet_ordering
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/cli.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/cli.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/conn.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/conn.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/obj.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/obj.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/thread.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/thread.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/ethdev.io to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/fib.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/fib.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/fib_nexthop_group_table.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/fib_nexthop_table.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/fib_routing_table.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/hash_func.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/hash_func.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/ipsec.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/ipsec.io to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/ipsec.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/ipsec_sa.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd_macswp.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd_macswp.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd_macswp_pcap.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/l2fwd_pcap.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/learner.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/learner.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/meter.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/meter.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/mirroring.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/mirroring.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/packet.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/pcap.io to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/recirculation.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/recirculation.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/registers.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.048 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/registers.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/rss.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/rss.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/selector.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/selector.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/selector.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/varbit.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/varbit.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/vxlan.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/vxlan.spec to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/vxlan_pcap.cli to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/vxlan_table.py to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/pipeline/examples/vxlan_table.txt to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/pipeline/examples
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/ptpclient/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ptpclient
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/ptpclient/ptpclient.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/ptpclient
00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_meter/Makefile to
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_meter 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_meter/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_meter 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_meter/main.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_meter 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_meter/rte_policer.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_meter 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_meter/rte_policer.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_meter 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/app_thread.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/args.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/cfg_file.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/cfg_file.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/cmdline.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/init.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/main.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/profile.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/profile_ov.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/profile_pie.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/profile_red.cfg to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/qos_sched/stats.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/qos_sched 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/rxtx_callbacks/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/rxtx_callbacks/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/rxtx_callbacks 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/efd_node/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/efd_node/node.c to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/efd_node 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/efd_server/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/efd_server/args.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/efd_server/args.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/efd_server/init.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/efd_server/init.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/efd_server/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/efd_server 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/server_node_efd/shared/common.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/server_node_efd/shared 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/service_cores/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/service_cores 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/service_cores/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/service_cores 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/skeleton/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/skeleton 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/skeleton/basicfwd.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/skeleton 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/timer/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/timer 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/timer/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/timer 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vdpa/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vdpa 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vdpa/commands.list to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vdpa 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vdpa/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vdpa 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vdpa/vdpa_blk_compact.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vdpa 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost/main.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost/virtio_net.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.049 Installing 
/home/vagrant/spdk_repo/dpdk/examples/vhost_blk/blk.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/blk_spec.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/vhost_blk.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/vhost_blk.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_blk/vhost_blk_compat.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_blk 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_crypto/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vhost_crypto/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vhost_crypto 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/channel_manager.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/channel_manager.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/channel_monitor.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/channel_monitor.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/oob_monitor.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/oob_monitor_nop.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/oob_monitor_x86.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/parse.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/parse.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/power_manager.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/power_manager.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.049 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/vm_power_cli.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/vm_power_cli.h to 
/home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/parse.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/parse.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vm_power_manager/guest_cli/vm_power_cli_guest.h to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vm_power_manager/guest_cli 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vmdq/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vmdq 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vmdq/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vmdq 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vmdq_dcb/Makefile to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:50.050 Installing /home/vagrant/spdk_repo/dpdk/examples/vmdq_dcb/main.c to /home/vagrant/spdk_repo/dpdk/build/share/dpdk/examples/vmdq_dcb 00:02:50.050 Installing lib/librte_log.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_log.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_kvargs.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_kvargs.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_telemetry.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_telemetry.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_eal.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_eal.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_ring.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_ring.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_rcu.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_rcu.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_mempool.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_mempool.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_mbuf.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_mbuf.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_net.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_net.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_meter.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_meter.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing 
lib/librte_ethdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_ethdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_pci.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_pci.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_cmdline.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_cmdline.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_metrics.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_metrics.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_hash.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_hash.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_timer.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_timer.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_acl.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_acl.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_bbdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_bbdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_bitratestats.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_bitratestats.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_bpf.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_bpf.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_cfgfile.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_cfgfile.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_compressdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_compressdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_cryptodev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_cryptodev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_distributor.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_distributor.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_dmadev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_dmadev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_efd.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_efd.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_eventdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_eventdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_dispatcher.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_dispatcher.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_gpudev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_gpudev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_gro.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_gro.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 
00:02:50.050 Installing lib/librte_gso.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_gso.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_ip_frag.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_ip_frag.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_jobstats.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_jobstats.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_latencystats.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_latencystats.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_lpm.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_lpm.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_member.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_member.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_pcapng.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_pcapng.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_power.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_power.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_rawdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_rawdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_regexdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_regexdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_mldev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_mldev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_rib.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_rib.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_reorder.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_reorder.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_sched.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_sched.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_security.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_security.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_stack.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_stack.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_vhost.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_vhost.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_ipsec.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_ipsec.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_pdcp.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_pdcp.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_fib.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_fib.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 
00:02:50.050 Installing lib/librte_port.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_port.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_pdump.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_pdump.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_table.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_table.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.050 Installing lib/librte_pipeline.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.051 Installing lib/librte_pipeline.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.051 Installing lib/librte_graph.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.051 Installing lib/librte_graph.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.313 Installing lib/librte_node.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.313 Installing lib/librte_node.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.313 Installing drivers/librte_bus_pci.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.313 Installing drivers/librte_bus_pci.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0 00:02:50.313 Installing drivers/librte_bus_vdev.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.313 Installing drivers/librte_bus_vdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0 00:02:50.313 Installing drivers/librte_mempool_ring.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.313 Installing drivers/librte_mempool_ring.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0 00:02:50.313 Installing drivers/librte_net_i40e.a to /home/vagrant/spdk_repo/dpdk/build/lib 00:02:50.313 Installing drivers/librte_net_i40e.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0 00:02:50.313 Installing app/dpdk-dumpcap to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-graph to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-pdump to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-proc-info to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-acl to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-bbdev to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-cmdline to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-compress-perf to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-crypto-perf to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-dma-perf to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-eventdev to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-fib to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-flow-perf to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-gpudev to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-mldev to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-pipeline to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-testpmd to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-regex to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing app/dpdk-test-sad to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing 
app/dpdk-test-security-perf to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/config/rte_config.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/log/rte_log.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/kvargs/rte_kvargs.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/telemetry/rte_telemetry.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_atomic.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_byteorder.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_cpuflags.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_cycles.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_io.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_memcpy.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_pause.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_power_intrinsics.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_prefetch.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_rwlock.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_spinlock.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/generic/rte_vect.h to /home/vagrant/spdk_repo/dpdk/build/include/generic 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_atomic.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_byteorder.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_cpuflags.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_cycles.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_io.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_memcpy.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_pause.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_power_intrinsics.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_prefetch.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing 
/home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_rtm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_rwlock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_spinlock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_vect.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_atomic_32.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_atomic_64.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_byteorder_32.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/x86/include/rte_byteorder_64.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_alarm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_bitmap.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_bitops.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_branch_prediction.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_bus.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_class.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_common.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_compat.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_debug.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_dev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_devargs.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_eal.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_eal_memconfig.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_eal_trace.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.313 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_errno.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_epoll.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_fbarray.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_hexdump.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_hypervisor.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing 
/home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_interrupts.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_keepalive.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_launch.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_lcore.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_lock_annotations.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_malloc.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_mcslock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_memory.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_memzone.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_pci_dev_feature_defs.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_pci_dev_features.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_per_lcore.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_pflock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_random.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_reciprocal.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_seqcount.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_seqlock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_service.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_service_component.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_stdatomic.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_string_fns.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_tailq.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_thread.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_ticketlock.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_time.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_trace.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_trace_point.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing 
/home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_trace_point_register.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_uuid.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_version.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/include/rte_vfio.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/eal/linux/include/rte_os.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_elem.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_elem_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_c11_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_generic_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_hts.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_hts_elem_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_peek.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_peek_elem_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_peek_zc.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_rts.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ring/rte_ring_rts_elem_pvt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/rcu/rte_rcu_qsbr.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/mempool/rte_mempool.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/mempool/rte_mempool_trace_fp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/mbuf/rte_mbuf.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/mbuf/rte_mbuf_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/mbuf/rte_mbuf_ptype.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/mbuf/rte_mbuf_pool_ops.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/mbuf/rte_mbuf_dyn.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_ip.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_tcp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing 
/home/vagrant/spdk_repo/dpdk/lib/net/rte_udp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_tls.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_dtls.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_esp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_sctp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_icmp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_arp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_ether.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_macsec.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_vxlan.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_gre.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_gtp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_net.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_net_crc.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_mpls.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_higig.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_ecpri.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_pdcp_hdr.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_geneve.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_l2tpv2.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_ppp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/net/rte_ib.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/meter/rte_meter.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_cman.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_ethdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_ethdev_trace_fp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_dev_info.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_flow.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_flow_driver.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_mtr.h to 
/home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_mtr_driver.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_tm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_tm_driver.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_ethdev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/ethdev/rte_eth_ctrl.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/pci/rte_pci.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.314 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse_num.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse_ipaddr.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse_etheraddr.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse_string.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_rdline.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_vt100.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_socket.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_cirbuf.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cmdline/cmdline_parse_portlist.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/metrics/rte_metrics.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/metrics/rte_metrics_telemetry.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_fbk_hash.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_hash_crc.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_hash.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_jhash.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_thash.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_thash_gfni.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_crc_arm64.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_crc_generic.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_crc_sw.h to 
/home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_crc_x86.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/hash/rte_thash_x86_gfni.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/timer/rte_timer.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/acl/rte_acl.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/acl/rte_acl_osdep.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/bbdev/rte_bbdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/bbdev/rte_bbdev_pmd.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/bbdev/rte_bbdev_op.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/bitratestats/rte_bitrate.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/bpf/bpf_def.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/bpf/rte_bpf.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/bpf/rte_bpf_ethdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cfgfile/rte_cfgfile.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/compressdev/rte_compressdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/compressdev/rte_comp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_cryptodev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_cryptodev_trace_fp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_crypto.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_crypto_sym.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_crypto_asym.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/cryptodev/rte_cryptodev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/distributor/rte_distributor.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/dmadev/rte_dmadev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/dmadev/rte_dmadev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/efd/rte_efd.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_crypto_adapter.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_dma_adapter.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_eth_rx_adapter.h 
to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_eth_tx_adapter.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_ring.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_event_timer_adapter.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_eventdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_eventdev_trace_fp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/eventdev/rte_eventdev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/dispatcher/rte_dispatcher.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/gpudev/rte_gpudev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/gro/rte_gro.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/gso/rte_gso.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/ip_frag/rte_ip_frag.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/jobstats/rte_jobstats.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/latencystats/rte_latencystats.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm6.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm_altivec.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm_neon.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm_scalar.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm_sse.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/lpm/rte_lpm_sve.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/member/rte_member.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/pcapng/rte_pcapng.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/power/rte_power.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/power/rte_power_guest_channel.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/power/rte_power_pmd_mgmt.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/power/rte_power_uncore.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/rawdev/rte_rawdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/rawdev/rte_rawdev_pmd.h to 
/home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/regexdev/rte_regexdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/regexdev/rte_regexdev_driver.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/regexdev/rte_regexdev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/mldev/rte_mldev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/mldev/rte_mldev_core.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/rib/rte_rib.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/rib/rte_rib6.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/reorder/rte_reorder.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/sched/rte_approx.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/sched/rte_red.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/sched/rte_sched.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/sched/rte_sched_common.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/sched/rte_pie.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/security/rte_security.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/security/rte_security_driver.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack_std.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack_lf.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack_lf_generic.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack_lf_c11.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/stack/rte_stack_lf_stubs.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.315 Installing /home/vagrant/spdk_repo/dpdk/lib/vhost/rte_vdpa.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/vhost/rte_vhost.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/vhost/rte_vhost_async.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/vhost/rte_vhost_crypto.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/ipsec/rte_ipsec.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/ipsec/rte_ipsec_sa.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/ipsec/rte_ipsec_sad.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 
Installing /home/vagrant/spdk_repo/dpdk/lib/ipsec/rte_ipsec_group.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/pdcp/rte_pdcp.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/pdcp/rte_pdcp_group.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/fib/rte_fib.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/fib/rte_fib6.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_ethdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_fd.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_frag.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_ras.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_ring.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_sched.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_source_sink.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_sym_crypto.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_port_eventdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_swx_port.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_swx_port_ethdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_swx_port_fd.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_swx_port_ring.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/port/rte_swx_port_source_sink.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/pdump/rte_pdump.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_lru.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_hash_func.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_table.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_table_em.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_table_learner.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_table_selector.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_swx_table_wm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing 
/home/vagrant/spdk_repo/dpdk/lib/table/rte_table.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_acl.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_array.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_hash.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_hash_cuckoo.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_hash_func.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_lpm.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_lpm_ipv6.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_stub.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_lru_arm64.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_lru_x86.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/table/rte_table_hash_func_arm64.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_pipeline.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_port_in_action.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_table_action.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_swx_ipsec.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_swx_pipeline.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_swx_extern.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/pipeline/rte_swx_ctl.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/graph/rte_graph.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/graph/rte_graph_worker.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/graph/rte_graph_model_mcore_dispatch.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/graph/rte_graph_model_rtc.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/graph/rte_graph_worker_common.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/node/rte_node_eth_api.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/node/rte_node_ip4_api.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/node/rte_node_ip6_api.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/lib/node/rte_node_udp4_input_api.h to 
/home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/drivers/bus/pci/rte_bus_pci.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/drivers/bus/vdev/rte_bus_vdev.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/drivers/net/i40e/rte_pmd_i40e.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/buildtools/dpdk-cmdline-gen.py to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/usertools/dpdk-devbind.py to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/usertools/dpdk-pmdinfo.py to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/usertools/dpdk-telemetry.py to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/usertools/dpdk-hugepages.py to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/usertools/dpdk-rss-flows.py to /home/vagrant/spdk_repo/dpdk/build/bin 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/build-tmp/rte_build_config.h to /home/vagrant/spdk_repo/dpdk/build/include 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/build-tmp/meson-private/libdpdk-libs.pc to /home/vagrant/spdk_repo/dpdk/build/lib/pkgconfig 00:02:50.316 Installing /home/vagrant/spdk_repo/dpdk/build-tmp/meson-private/libdpdk.pc to /home/vagrant/spdk_repo/dpdk/build/lib/pkgconfig 00:02:50.316 Installing symlink pointing to librte_log.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_log.so.24 00:02:50.316 Installing symlink pointing to librte_log.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_log.so 00:02:50.316 Installing symlink pointing to librte_kvargs.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_kvargs.so.24 00:02:50.316 Installing symlink pointing to librte_kvargs.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_kvargs.so 00:02:50.316 Installing symlink pointing to librte_telemetry.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_telemetry.so.24 00:02:50.316 Installing symlink pointing to librte_telemetry.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_telemetry.so 00:02:50.316 Installing symlink pointing to librte_eal.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_eal.so.24 00:02:50.316 Installing symlink pointing to librte_eal.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_eal.so 00:02:50.316 Installing symlink pointing to librte_ring.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ring.so.24 00:02:50.316 Installing symlink pointing to librte_ring.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ring.so 00:02:50.316 Installing symlink pointing to librte_rcu.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rcu.so.24 00:02:50.316 Installing symlink pointing to librte_rcu.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rcu.so 00:02:50.316 Installing symlink pointing to librte_mempool.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool.so.24 00:02:50.316 Installing symlink pointing to librte_mempool.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_mempool.so 00:02:50.316 Installing symlink pointing to librte_mbuf.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_mbuf.so.24 00:02:50.316 Installing symlink pointing to librte_mbuf.so.24 to 
/home/vagrant/spdk_repo/dpdk/build/lib/librte_mbuf.so 00:02:50.316 Installing symlink pointing to librte_net.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_net.so.24 00:02:50.316 Installing symlink pointing to librte_net.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_net.so 00:02:50.316 Installing symlink pointing to librte_meter.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_meter.so.24 00:02:50.316 Installing symlink pointing to librte_meter.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_meter.so 00:02:50.316 Installing symlink pointing to librte_ethdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ethdev.so.24 00:02:50.317 Installing symlink pointing to librte_ethdev.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ethdev.so 00:02:50.317 Installing symlink pointing to librte_pci.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pci.so.24 00:02:50.317 Installing symlink pointing to librte_pci.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pci.so 00:02:50.317 Installing symlink pointing to librte_cmdline.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cmdline.so.24 00:02:50.317 Installing symlink pointing to librte_cmdline.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cmdline.so 00:02:50.317 Installing symlink pointing to librte_metrics.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_metrics.so.24 00:02:50.317 Installing symlink pointing to librte_metrics.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_metrics.so 00:02:50.317 Installing symlink pointing to librte_hash.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_hash.so.24 00:02:50.317 Installing symlink pointing to librte_hash.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_hash.so 00:02:50.317 Installing symlink pointing to librte_timer.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_timer.so.24 00:02:50.317 Installing symlink pointing to librte_timer.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_timer.so 00:02:50.317 Installing symlink pointing to librte_acl.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_acl.so.24 00:02:50.317 Installing symlink pointing to librte_acl.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_acl.so 00:02:50.317 Installing symlink pointing to librte_bbdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bbdev.so.24 00:02:50.317 Installing symlink pointing to librte_bbdev.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bbdev.so 00:02:50.317 Installing symlink pointing to librte_bitratestats.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bitratestats.so.24 00:02:50.317 Installing symlink pointing to librte_bitratestats.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bitratestats.so 00:02:50.317 Installing symlink pointing to librte_bpf.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bpf.so.24 00:02:50.317 Installing symlink pointing to librte_bpf.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_bpf.so 00:02:50.317 Installing symlink pointing to librte_cfgfile.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cfgfile.so.24 00:02:50.317 Installing symlink pointing to librte_cfgfile.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cfgfile.so 00:02:50.317 Installing symlink pointing to librte_compressdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_compressdev.so.24 00:02:50.317 Installing symlink pointing to librte_compressdev.so.24 to 
/home/vagrant/spdk_repo/dpdk/build/lib/librte_compressdev.so 00:02:50.317 Installing symlink pointing to librte_cryptodev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cryptodev.so.24 00:02:50.317 Installing symlink pointing to librte_cryptodev.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_cryptodev.so 00:02:50.317 Installing symlink pointing to librte_distributor.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_distributor.so.24 00:02:50.317 Installing symlink pointing to librte_distributor.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_distributor.so 00:02:50.317 Installing symlink pointing to librte_dmadev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_dmadev.so.24 00:02:50.317 Installing symlink pointing to librte_dmadev.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_dmadev.so 00:02:50.317 Installing symlink pointing to librte_efd.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_efd.so.24 00:02:50.317 Installing symlink pointing to librte_efd.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_efd.so 00:02:50.317 Installing symlink pointing to librte_eventdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_eventdev.so.24 00:02:50.317 Installing symlink pointing to librte_eventdev.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_eventdev.so 00:02:50.317 Installing symlink pointing to librte_dispatcher.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_dispatcher.so.24 00:02:50.317 Installing symlink pointing to librte_dispatcher.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_dispatcher.so 00:02:50.317 Installing symlink pointing to librte_gpudev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gpudev.so.24 00:02:50.317 Installing symlink pointing to librte_gpudev.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gpudev.so 00:02:50.317 Installing symlink pointing to librte_gro.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gro.so.24 00:02:50.317 Installing symlink pointing to librte_gro.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gro.so 00:02:50.317 Installing symlink pointing to librte_gso.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gso.so.24 00:02:50.317 Installing symlink pointing to librte_gso.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_gso.so 00:02:50.317 Installing symlink pointing to librte_ip_frag.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ip_frag.so.24 00:02:50.317 Installing symlink pointing to librte_ip_frag.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ip_frag.so 00:02:50.317 Installing symlink pointing to librte_jobstats.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_jobstats.so.24 00:02:50.317 Installing symlink pointing to librte_jobstats.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_jobstats.so 00:02:50.317 Installing symlink pointing to librte_latencystats.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_latencystats.so.24 00:02:50.317 Installing symlink pointing to librte_latencystats.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_latencystats.so 00:02:50.317 Installing symlink pointing to librte_lpm.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_lpm.so.24 00:02:50.317 Installing symlink pointing to librte_lpm.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_lpm.so 00:02:50.317 Installing symlink pointing to librte_member.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_member.so.24 00:02:50.317 Installing symlink pointing to 
librte_member.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_member.so 00:02:50.317 Installing symlink pointing to librte_pcapng.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pcapng.so.24 00:02:50.317 Installing symlink pointing to librte_pcapng.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pcapng.so 00:02:50.317 Installing symlink pointing to librte_power.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_power.so.24 00:02:50.317 Installing symlink pointing to librte_power.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_power.so 00:02:50.317 Installing symlink pointing to librte_rawdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rawdev.so.24 00:02:50.317 Installing symlink pointing to librte_rawdev.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rawdev.so 00:02:50.317 Installing symlink pointing to librte_regexdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_regexdev.so.24 00:02:50.317 Installing symlink pointing to librte_regexdev.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_regexdev.so 00:02:50.317 Installing symlink pointing to librte_mldev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_mldev.so.24 00:02:50.317 Installing symlink pointing to librte_mldev.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_mldev.so 00:02:50.317 Installing symlink pointing to librte_rib.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rib.so.24 00:02:50.317 Installing symlink pointing to librte_rib.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_rib.so 00:02:50.317 Installing symlink pointing to librte_reorder.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_reorder.so.24 00:02:50.317 Installing symlink pointing to librte_reorder.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_reorder.so 00:02:50.317 Installing symlink pointing to librte_sched.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_sched.so.24 00:02:50.317 Installing symlink pointing to librte_sched.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_sched.so 00:02:50.317 Installing symlink pointing to librte_security.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_security.so.24 00:02:50.317 Installing symlink pointing to librte_security.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_security.so 00:02:50.317 './librte_bus_pci.so' -> 'dpdk/pmds-24.0/librte_bus_pci.so' 00:02:50.317 './librte_bus_pci.so.24' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24' 00:02:50.317 './librte_bus_pci.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_pci.so.24.0' 00:02:50.317 './librte_bus_vdev.so' -> 'dpdk/pmds-24.0/librte_bus_vdev.so' 00:02:50.317 './librte_bus_vdev.so.24' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24' 00:02:50.317 './librte_bus_vdev.so.24.0' -> 'dpdk/pmds-24.0/librte_bus_vdev.so.24.0' 00:02:50.317 './librte_mempool_ring.so' -> 'dpdk/pmds-24.0/librte_mempool_ring.so' 00:02:50.317 './librte_mempool_ring.so.24' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24' 00:02:50.317 './librte_mempool_ring.so.24.0' -> 'dpdk/pmds-24.0/librte_mempool_ring.so.24.0' 00:02:50.317 './librte_net_i40e.so' -> 'dpdk/pmds-24.0/librte_net_i40e.so' 00:02:50.317 './librte_net_i40e.so.24' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24' 00:02:50.317 './librte_net_i40e.so.24.0' -> 'dpdk/pmds-24.0/librte_net_i40e.so.24.0' 00:02:50.317 Installing symlink pointing to librte_stack.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_stack.so.24 00:02:50.317 Installing symlink pointing to librte_stack.so.24 to 
/home/vagrant/spdk_repo/dpdk/build/lib/librte_stack.so 00:02:50.317 Installing symlink pointing to librte_vhost.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_vhost.so.24 00:02:50.317 Installing symlink pointing to librte_vhost.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_vhost.so 00:02:50.317 Installing symlink pointing to librte_ipsec.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ipsec.so.24 00:02:50.317 Installing symlink pointing to librte_ipsec.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_ipsec.so 00:02:50.317 Installing symlink pointing to librte_pdcp.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pdcp.so.24 00:02:50.317 Installing symlink pointing to librte_pdcp.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pdcp.so 00:02:50.317 Installing symlink pointing to librte_fib.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_fib.so.24 00:02:50.317 Installing symlink pointing to librte_fib.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_fib.so 00:02:50.317 Installing symlink pointing to librte_port.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_port.so.24 00:02:50.317 Installing symlink pointing to librte_port.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_port.so 00:02:50.317 Installing symlink pointing to librte_pdump.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pdump.so.24 00:02:50.317 Installing symlink pointing to librte_pdump.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pdump.so 00:02:50.317 Installing symlink pointing to librte_table.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_table.so.24 00:02:50.317 Installing symlink pointing to librte_table.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_table.so 00:02:50.317 Installing symlink pointing to librte_pipeline.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pipeline.so.24 00:02:50.317 Installing symlink pointing to librte_pipeline.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_pipeline.so 00:02:50.317 Installing symlink pointing to librte_graph.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_graph.so.24 00:02:50.318 Installing symlink pointing to librte_graph.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_graph.so 00:02:50.318 Installing symlink pointing to librte_node.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_node.so.24 00:02:50.318 Installing symlink pointing to librte_node.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/librte_node.so 00:02:50.318 Installing symlink pointing to librte_bus_pci.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24 00:02:50.318 Installing symlink pointing to librte_bus_pci.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:02:50.318 Installing symlink pointing to librte_bus_vdev.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24 00:02:50.318 Installing symlink pointing to librte_bus_vdev.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:02:50.318 Installing symlink pointing to librte_mempool_ring.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24 00:02:50.318 Installing symlink pointing to librte_mempool_ring.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:02:50.318 Installing symlink pointing to librte_net_i40e.so.24.0 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24 
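The symlink pass above follows the standard versioned shared-library layout: each real object is installed as librte_<name>.so.24.0, an ABI-versioned and an unversioned symlink are layered on top, and driver (PMD) libraries are relocated into a plugin directory. A minimal sketch of the equivalent commands, using paths from this log (the ln invocations are illustrative, not taken from the install script):

    # Sketch only; the logged run does this via meson install plus
    # buildtools/symlink-drivers-solibs.sh, not by hand.
    cd /home/vagrant/spdk_repo/dpdk/build/lib
    ln -sf librte_eal.so.24.0 librte_eal.so.24   # ABI-versioned name used at runtime
    ln -sf librte_eal.so.24 librte_eal.so        # unversioned name used at link time
    ls dpdk/pmds-24.0/librte_net_i40e.so*        # PMDs get the same chain under the plugin dir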
00:02:50.318 Installing symlink pointing to librte_net_i40e.so.24 to /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so
00:02:50.318 Running custom install script '/bin/sh /home/vagrant/spdk_repo/dpdk/config/../buildtools/symlink-drivers-solibs.sh lib dpdk/pmds-24.0'
00:02:50.318 04:53:10 build_native_dpdk -- common/autobuild_common.sh@220 -- $ cat
00:02:50.318 ************************************
00:02:50.318 END TEST build_native_dpdk
00:02:50.318 ************************************
00:02:50.318 04:53:10 build_native_dpdk -- common/autobuild_common.sh@225 -- $ cd /home/vagrant/spdk_repo/spdk
00:02:50.318
00:02:50.318 real 0m38.123s
00:02:50.318 user 4m24.598s
00:02:50.318 sys 0m37.940s
00:02:50.318 04:53:10 build_native_dpdk -- common/autotest_common.sh@1130 -- $ xtrace_disable
00:02:50.318 04:53:10 build_native_dpdk -- common/autotest_common.sh@10 -- $ set +x
00:02:50.318 04:53:10 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:02:50.318 04:53:10 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:02:50.318 04:53:10 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:02:50.318 04:53:10 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:02:50.318 04:53:10 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:02:50.318 04:53:10 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:02:50.318 04:53:10 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:02:50.318 04:53:10 -- spdk/autobuild.sh@67 -- $ /home/vagrant/spdk_repo/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --enable-ubsan --enable-asan --enable-coverage --with-ublk --with-dpdk=/home/vagrant/spdk_repo/dpdk/build --with-xnvme --with-shared
00:02:50.578 Using /home/vagrant/spdk_repo/dpdk/build/lib/pkgconfig for additional libs...
00:02:50.578 DPDK libraries: /home/vagrant/spdk_repo/dpdk/build/lib
00:02:50.578 DPDK includes: /home/vagrant/spdk_repo/dpdk/build/include
00:02:50.578 Using default SPDK env in /home/vagrant/spdk_repo/spdk/lib/env_dpdk
00:02:50.840 Using 'verbs' RDMA provider
00:03:02.284 Configuring ISA-L (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal.log)...done.
00:03:12.270 Configuring ISA-L-crypto (logfile: /home/vagrant/spdk_repo/spdk/.spdk-isal-crypto.log)...done.
00:03:12.270 Creating mk/config.mk...done.
00:03:12.270 Creating mk/cc.flags.mk...done.
00:03:12.270 Type 'make' to build.
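The configure step just logged consumes that DPDK tree through pkg-config: --with-dpdk points at the install prefix, and the libdpdk.pc installed earlier supplies the library and include paths echoed above. A reduced sketch of the same wiring (only a subset of the logged flags; the explicit pkg-config query is illustrative, configure performs its own lookup):

    # Sketch, assuming the DPDK install tree produced above.
    export PKG_CONFIG_PATH=/home/vagrant/spdk_repo/dpdk/build/lib/pkgconfig:$PKG_CONFIG_PATH
    pkg-config --cflags --libs libdpdk   # resolves against the libdpdk.pc installed above
    cd /home/vagrant/spdk_repo/spdk
    ./configure --with-dpdk=/home/vagrant/spdk_repo/dpdk/build --with-shared
    make -j10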
00:03:12.270 04:53:32 -- spdk/autobuild.sh@70 -- $ run_test make make -j10 00:03:12.270 04:53:32 -- common/autotest_common.sh@1105 -- $ '[' 3 -le 1 ']' 00:03:12.270 04:53:32 -- common/autotest_common.sh@1111 -- $ xtrace_disable 00:03:12.270 04:53:32 -- common/autotest_common.sh@10 -- $ set +x 00:03:12.270 ************************************ 00:03:12.270 START TEST make 00:03:12.270 ************************************ 00:03:12.270 04:53:32 make -- common/autotest_common.sh@1129 -- $ make -j10 00:03:12.270 (cd /home/vagrant/spdk_repo/spdk/xnvme && \ 00:03:12.270 export PKG_CONFIG_PATH=$PKG_CONFIG_PATH:/usr/lib/pkgconfig:/usr/lib64/pkgconfig && \ 00:03:12.270 meson setup builddir \ 00:03:12.270 -Dwith-libaio=enabled \ 00:03:12.270 -Dwith-liburing=enabled \ 00:03:12.270 -Dwith-libvfn=disabled \ 00:03:12.270 -Dwith-spdk=disabled \ 00:03:12.270 -Dexamples=false \ 00:03:12.270 -Dtests=false \ 00:03:12.270 -Dtools=false && \ 00:03:12.270 meson compile -C builddir && \ 00:03:12.270 cd -) 00:03:14.170 The Meson build system 00:03:14.170 Version: 1.5.0 00:03:14.170 Source dir: /home/vagrant/spdk_repo/spdk/xnvme 00:03:14.170 Build dir: /home/vagrant/spdk_repo/spdk/xnvme/builddir 00:03:14.170 Build type: native build 00:03:14.170 Project name: xnvme 00:03:14.170 Project version: 0.7.5 00:03:14.170 C compiler for the host machine: gcc (gcc 13.3.1 "gcc (GCC) 13.3.1 20240522 (Red Hat 13.3.1-1)") 00:03:14.170 C linker for the host machine: gcc ld.bfd 2.40-14 00:03:14.170 Host machine cpu family: x86_64 00:03:14.170 Host machine cpu: x86_64 00:03:14.170 Message: host_machine.system: linux 00:03:14.170 Compiler for C supports arguments -Wno-missing-braces: YES 00:03:14.170 Compiler for C supports arguments -Wno-cast-function-type: YES 00:03:14.170 Compiler for C supports arguments -Wno-strict-aliasing: YES 00:03:14.170 Run-time dependency threads found: YES 00:03:14.170 Has header "setupapi.h" : NO 00:03:14.170 Has header "linux/blkzoned.h" : YES 00:03:14.170 Has header "linux/blkzoned.h" : YES (cached) 00:03:14.170 Has header "libaio.h" : YES 00:03:14.170 Library aio found: YES 00:03:14.170 Found pkg-config: YES (/usr/bin/pkg-config) 1.9.5 00:03:14.170 Run-time dependency liburing found: YES 2.2 00:03:14.170 Dependency libvfn skipped: feature with-libvfn disabled 00:03:14.170 Found CMake: /usr/bin/cmake (3.27.7) 00:03:14.170 Run-time dependency libisal found: NO (tried pkgconfig and cmake) 00:03:14.170 Subproject spdk : skipped: feature with-spdk disabled 00:03:14.170 Run-time dependency appleframeworks found: NO (tried framework) 00:03:14.170 Run-time dependency appleframeworks found: NO (tried framework) 00:03:14.170 Library rt found: YES 00:03:14.170 Checking for function "clock_gettime" with dependency -lrt: YES 00:03:14.170 Configuring xnvme_config.h using configuration 00:03:14.170 Configuring xnvme.spec using configuration 00:03:14.170 Run-time dependency bash-completion found: YES 2.11 00:03:14.170 Message: Bash-completions: /usr/share/bash-completion/completions 00:03:14.170 Program cp found: YES (/usr/bin/cp) 00:03:14.170 Build targets in project: 3 00:03:14.170 00:03:14.170 xnvme 0.7.5 00:03:14.170 00:03:14.170 Subprojects 00:03:14.170 spdk : NO Feature 'with-spdk' disabled 00:03:14.170 00:03:14.170 User defined options 00:03:14.170 examples : false 00:03:14.170 tests : false 00:03:14.170 tools : false 00:03:14.170 with-libaio : enabled 00:03:14.170 with-liburing: enabled 00:03:14.170 with-libvfn : disabled 00:03:14.170 with-spdk : disabled 00:03:14.170 00:03:14.170 Found 
ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:03:14.428 ninja: Entering directory `/home/vagrant/spdk_repo/spdk/xnvme/builddir' 00:03:14.685 [1/76] Generating toolbox/xnvme-driver-script with a custom command 00:03:14.685 [2/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_fbsd.c.o 00:03:14.685 [3/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_fbsd_dev.c.o 00:03:14.685 [4/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_fbsd_async.c.o 00:03:14.686 [5/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_cbi_admin_shim.c.o 00:03:14.686 [6/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_cbi_async_emu.c.o 00:03:14.686 [7/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_cbi_async_nil.c.o 00:03:14.686 [8/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_cbi_mem_posix.c.o 00:03:14.686 [9/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_fbsd_nvme.c.o 00:03:14.686 [10/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_adm.c.o 00:03:14.686 [11/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_cbi_sync_psync.c.o 00:03:14.686 [12/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_cbi_async_posix.c.o 00:03:14.686 [13/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_linux.c.o 00:03:14.686 [14/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_macos_admin.c.o 00:03:14.686 [15/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_macos.c.o 00:03:14.686 [16/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be.c.o 00:03:14.686 [17/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_macos_dev.c.o 00:03:14.686 [18/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_cbi_async_thrpool.c.o 00:03:14.686 [19/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_macos_sync.c.o 00:03:14.686 [20/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_linux_async_libaio.c.o 00:03:14.686 [21/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_linux_hugepage.c.o 00:03:14.944 [22/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_ramdisk.c.o 00:03:14.944 [23/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_linux_async_ucmd.c.o 00:03:14.944 [24/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_linux_block.c.o 00:03:14.944 [25/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_linux_nvme.c.o 00:03:14.944 [26/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_ramdisk_admin.c.o 00:03:14.944 [27/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_nosys.c.o 00:03:14.944 [28/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_spdk.c.o 00:03:14.944 [29/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_ramdisk_dev.c.o 00:03:14.944 [30/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_spdk_admin.c.o 00:03:14.944 [31/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_linux_dev.c.o 00:03:14.944 [32/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_spdk_async.c.o 00:03:14.944 [33/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_ramdisk_sync.c.o 00:03:14.944 [34/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_spdk_dev.c.o 00:03:14.944 [35/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_linux_async_liburing.c.o 00:03:14.944 [36/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_spdk_mem.c.o 00:03:14.944 [37/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_vfio.c.o 00:03:14.944 [38/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_vfio_admin.c.o 00:03:14.944 [39/76] Compiling C object 
lib/libxnvme.so.0.7.5.p/xnvme_be_spdk_sync.c.o 00:03:14.944 [40/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_vfio_async.c.o 00:03:14.944 [41/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_windows.c.o 00:03:14.944 [42/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_vfio_dev.c.o 00:03:14.944 [43/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_vfio_mem.c.o 00:03:14.944 [44/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_windows_async_iocp.c.o 00:03:14.944 [45/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_vfio_sync.c.o 00:03:14.944 [46/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_windows_async_iocp_th.c.o 00:03:14.944 [47/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_windows_async_ioring.c.o 00:03:14.944 [48/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_windows_dev.c.o 00:03:14.944 [49/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_windows_mem.c.o 00:03:14.944 [50/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_windows_block.c.o 00:03:14.944 [51/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_windows_nvme.c.o 00:03:14.944 [52/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_be_windows_fs.c.o 00:03:14.944 [53/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_libconf_entries.c.o 00:03:14.944 [54/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_file.c.o 00:03:14.944 [55/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_cmd.c.o 00:03:14.944 [56/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_req.c.o 00:03:15.200 [57/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_libconf.c.o 00:03:15.200 [58/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_geo.c.o 00:03:15.200 [59/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_ident.c.o 00:03:15.200 [60/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_lba.c.o 00:03:15.200 [61/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_buf.c.o 00:03:15.200 [62/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_nvm.c.o 00:03:15.200 [63/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_opts.c.o 00:03:15.200 [64/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_queue.c.o 00:03:15.200 [65/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_ver.c.o 00:03:15.200 [66/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_topology.c.o 00:03:15.200 [67/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_dev.c.o 00:03:15.200 [68/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_crc.c.o 00:03:15.200 [69/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_kvs.c.o 00:03:15.200 [70/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_pi.c.o 00:03:15.200 [71/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_spec_pp.c.o 00:03:15.200 [72/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_znd.c.o 00:03:15.457 [73/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_cli.c.o 00:03:15.714 [74/76] Compiling C object lib/libxnvme.so.0.7.5.p/xnvme_spec.c.o 00:03:15.714 [75/76] Linking static target lib/libxnvme.a 00:03:15.714 [76/76] Linking target lib/libxnvme.so.0.7.5 00:03:15.714 INFO: autodetecting backend as ninja 00:03:15.714 INFO: calculating backend command to run: /usr/local/bin/ninja -C /home/vagrant/spdk_repo/spdk/xnvme/builddir 00:03:15.714 /home/vagrant/spdk_repo/spdk/xnvmebuild 00:03:47.880 CC lib/ut_mock/mock.o 00:03:47.880 CC lib/log/log.o 00:03:47.880 CC lib/log/log_flags.o 00:03:47.880 CC lib/log/log_deprecated.o 00:03:47.880 CC lib/ut/ut.o 00:03:47.880 LIB libspdk_log.a 00:03:47.880 LIB libspdk_ut.a 00:03:47.880 
LIB libspdk_ut_mock.a 00:03:47.880 SO libspdk_log.so.7.1 00:03:47.880 SO libspdk_ut.so.2.0 00:03:47.880 SO libspdk_ut_mock.so.6.0 00:03:47.880 SYMLINK libspdk_ut.so 00:03:47.880 SYMLINK libspdk_ut_mock.so 00:03:47.880 SYMLINK libspdk_log.so 00:03:47.880 CC lib/ioat/ioat.o 00:03:47.880 CXX lib/trace_parser/trace.o 00:03:47.880 CC lib/dma/dma.o 00:03:47.880 CC lib/util/base64.o 00:03:47.880 CC lib/util/bit_array.o 00:03:47.880 CC lib/util/crc16.o 00:03:47.880 CC lib/util/cpuset.o 00:03:47.880 CC lib/util/crc32c.o 00:03:47.880 CC lib/util/crc32.o 00:03:47.880 CC lib/vfio_user/host/vfio_user_pci.o 00:03:47.880 CC lib/util/crc32_ieee.o 00:03:47.880 CC lib/util/crc64.o 00:03:47.880 CC lib/util/dif.o 00:03:47.880 CC lib/util/fd.o 00:03:47.880 LIB libspdk_dma.a 00:03:47.880 CC lib/util/fd_group.o 00:03:47.880 CC lib/util/file.o 00:03:47.880 SO libspdk_dma.so.5.0 00:03:47.880 LIB libspdk_ioat.a 00:03:47.880 CC lib/util/hexlify.o 00:03:47.880 CC lib/util/iov.o 00:03:47.880 SO libspdk_ioat.so.7.0 00:03:47.880 SYMLINK libspdk_dma.so 00:03:47.880 CC lib/util/math.o 00:03:47.880 CC lib/util/net.o 00:03:47.880 SYMLINK libspdk_ioat.so 00:03:47.880 CC lib/util/pipe.o 00:03:47.880 CC lib/util/strerror_tls.o 00:03:47.880 CC lib/vfio_user/host/vfio_user.o 00:03:47.880 CC lib/util/string.o 00:03:47.880 CC lib/util/uuid.o 00:03:47.880 CC lib/util/xor.o 00:03:47.880 CC lib/util/zipf.o 00:03:47.880 CC lib/util/md5.o 00:03:47.880 LIB libspdk_vfio_user.a 00:03:47.880 SO libspdk_vfio_user.so.5.0 00:03:47.880 SYMLINK libspdk_vfio_user.so 00:03:47.880 LIB libspdk_util.a 00:03:47.880 SO libspdk_util.so.10.1 00:03:47.880 SYMLINK libspdk_util.so 00:03:47.880 LIB libspdk_trace_parser.a 00:03:47.880 SO libspdk_trace_parser.so.6.0 00:03:47.880 SYMLINK libspdk_trace_parser.so 00:03:47.880 CC lib/idxd/idxd_user.o 00:03:47.880 CC lib/idxd/idxd.o 00:03:47.880 CC lib/idxd/idxd_kernel.o 00:03:47.880 CC lib/conf/conf.o 00:03:47.880 CC lib/vmd/led.o 00:03:47.880 CC lib/vmd/vmd.o 00:03:47.880 CC lib/json/json_parse.o 00:03:47.880 CC lib/json/json_util.o 00:03:47.880 CC lib/rdma_utils/rdma_utils.o 00:03:47.880 CC lib/env_dpdk/env.o 00:03:47.881 CC lib/env_dpdk/memory.o 00:03:47.881 CC lib/json/json_write.o 00:03:47.881 CC lib/env_dpdk/pci.o 00:03:47.881 LIB libspdk_conf.a 00:03:47.881 CC lib/env_dpdk/init.o 00:03:47.881 CC lib/env_dpdk/threads.o 00:03:47.881 LIB libspdk_rdma_utils.a 00:03:47.881 SO libspdk_conf.so.6.0 00:03:47.881 SO libspdk_rdma_utils.so.1.0 00:03:47.881 SYMLINK libspdk_conf.so 00:03:47.881 CC lib/env_dpdk/pci_ioat.o 00:03:47.881 SYMLINK libspdk_rdma_utils.so 00:03:47.881 CC lib/env_dpdk/pci_virtio.o 00:03:47.881 CC lib/env_dpdk/pci_vmd.o 00:03:47.881 CC lib/env_dpdk/pci_idxd.o 00:03:47.881 LIB libspdk_json.a 00:03:47.881 CC lib/env_dpdk/pci_event.o 00:03:47.881 SO libspdk_json.so.6.0 00:03:47.881 CC lib/env_dpdk/sigbus_handler.o 00:03:47.881 CC lib/env_dpdk/pci_dpdk.o 00:03:47.881 SYMLINK libspdk_json.so 00:03:47.881 CC lib/env_dpdk/pci_dpdk_2207.o 00:03:47.881 LIB libspdk_vmd.a 00:03:47.881 SO libspdk_vmd.so.6.0 00:03:47.881 CC lib/env_dpdk/pci_dpdk_2211.o 00:03:47.881 SYMLINK libspdk_vmd.so 00:03:47.881 CC lib/rdma_provider/common.o 00:03:47.881 CC lib/rdma_provider/rdma_provider_verbs.o 00:03:47.881 LIB libspdk_idxd.a 00:03:47.881 SO libspdk_idxd.so.12.1 00:03:47.881 SYMLINK libspdk_idxd.so 00:03:47.881 CC lib/jsonrpc/jsonrpc_server.o 00:03:47.881 CC lib/jsonrpc/jsonrpc_client.o 00:03:47.881 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:03:47.881 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:03:48.141 LIB 
libspdk_rdma_provider.a 00:03:48.141 SO libspdk_rdma_provider.so.7.0 00:03:48.141 SYMLINK libspdk_rdma_provider.so 00:03:48.141 LIB libspdk_jsonrpc.a 00:03:48.141 SO libspdk_jsonrpc.so.6.0 00:03:48.402 SYMLINK libspdk_jsonrpc.so 00:03:48.663 CC lib/rpc/rpc.o 00:03:48.663 LIB libspdk_env_dpdk.a 00:03:48.663 SO libspdk_env_dpdk.so.15.1 00:03:48.663 LIB libspdk_rpc.a 00:03:48.663 SO libspdk_rpc.so.6.0 00:03:48.924 SYMLINK libspdk_rpc.so 00:03:48.924 SYMLINK libspdk_env_dpdk.so 00:03:48.924 CC lib/notify/notify.o 00:03:48.924 CC lib/notify/notify_rpc.o 00:03:48.924 CC lib/trace/trace.o 00:03:48.924 CC lib/trace/trace_flags.o 00:03:48.924 CC lib/trace/trace_rpc.o 00:03:48.924 CC lib/keyring/keyring.o 00:03:48.924 CC lib/keyring/keyring_rpc.o 00:03:49.182 LIB libspdk_notify.a 00:03:49.182 SO libspdk_notify.so.6.0 00:03:49.182 SYMLINK libspdk_notify.so 00:03:49.182 LIB libspdk_keyring.a 00:03:49.182 LIB libspdk_trace.a 00:03:49.182 SO libspdk_keyring.so.2.0 00:03:49.182 SO libspdk_trace.so.11.0 00:03:49.182 SYMLINK libspdk_keyring.so 00:03:49.441 SYMLINK libspdk_trace.so 00:03:49.441 CC lib/sock/sock.o 00:03:49.441 CC lib/sock/sock_rpc.o 00:03:49.441 CC lib/thread/thread.o 00:03:49.441 CC lib/thread/iobuf.o 00:03:50.007 LIB libspdk_sock.a 00:03:50.007 SO libspdk_sock.so.10.0 00:03:50.007 SYMLINK libspdk_sock.so 00:03:50.267 CC lib/nvme/nvme_fabric.o 00:03:50.267 CC lib/nvme/nvme_ctrlr.o 00:03:50.267 CC lib/nvme/nvme_ns_cmd.o 00:03:50.267 CC lib/nvme/nvme_ctrlr_cmd.o 00:03:50.267 CC lib/nvme/nvme_pcie.o 00:03:50.267 CC lib/nvme/nvme_pcie_common.o 00:03:50.267 CC lib/nvme/nvme_ns.o 00:03:50.267 CC lib/nvme/nvme_qpair.o 00:03:50.267 CC lib/nvme/nvme.o 00:03:50.838 CC lib/nvme/nvme_quirks.o 00:03:50.838 CC lib/nvme/nvme_transport.o 00:03:50.838 CC lib/nvme/nvme_discovery.o 00:03:51.098 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:03:51.098 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:03:51.098 CC lib/nvme/nvme_tcp.o 00:03:51.098 LIB libspdk_thread.a 00:03:51.098 CC lib/nvme/nvme_opal.o 00:03:51.098 SO libspdk_thread.so.11.0 00:03:51.098 CC lib/nvme/nvme_io_msg.o 00:03:51.360 SYMLINK libspdk_thread.so 00:03:51.360 CC lib/nvme/nvme_poll_group.o 00:03:51.360 CC lib/nvme/nvme_zns.o 00:03:51.360 CC lib/accel/accel.o 00:03:51.620 CC lib/accel/accel_rpc.o 00:03:51.620 CC lib/accel/accel_sw.o 00:03:51.620 CC lib/nvme/nvme_stubs.o 00:03:51.620 CC lib/nvme/nvme_auth.o 00:03:51.620 CC lib/nvme/nvme_cuse.o 00:03:51.620 CC lib/nvme/nvme_rdma.o 00:03:51.881 CC lib/blob/blobstore.o 00:03:51.881 CC lib/blob/request.o 00:03:51.881 CC lib/init/json_config.o 00:03:52.142 CC lib/virtio/virtio.o 00:03:52.142 CC lib/init/subsystem.o 00:03:52.403 CC lib/init/subsystem_rpc.o 00:03:52.403 CC lib/fsdev/fsdev.o 00:03:52.403 CC lib/fsdev/fsdev_io.o 00:03:52.403 CC lib/init/rpc.o 00:03:52.403 CC lib/virtio/virtio_vhost_user.o 00:03:52.403 CC lib/fsdev/fsdev_rpc.o 00:03:52.403 CC lib/blob/zeroes.o 00:03:52.403 CC lib/virtio/virtio_vfio_user.o 00:03:52.403 LIB libspdk_init.a 00:03:52.403 SO libspdk_init.so.6.0 00:03:52.664 CC lib/virtio/virtio_pci.o 00:03:52.664 SYMLINK libspdk_init.so 00:03:52.664 CC lib/blob/blob_bs_dev.o 00:03:52.664 LIB libspdk_accel.a 00:03:52.664 SO libspdk_accel.so.16.0 00:03:52.664 SYMLINK libspdk_accel.so 00:03:52.925 LIB libspdk_fsdev.a 00:03:52.925 CC lib/event/app.o 00:03:52.925 CC lib/event/log_rpc.o 00:03:52.925 CC lib/event/scheduler_static.o 00:03:52.926 CC lib/event/app_rpc.o 00:03:52.926 CC lib/event/reactor.o 00:03:52.926 LIB libspdk_virtio.a 00:03:52.926 CC lib/bdev/bdev.o 00:03:52.926 SO libspdk_fsdev.so.2.0 
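Each library in this make output passes through the same three stages: LIB (static archive), SO (versioned shared object), and SYMLINK (unversioned alias). A hypothetical spot-check, not part of the logged run, assuming SPDK's usual build/lib output directory:

    # Hypothetical verification step only.
    cd /home/vagrant/spdk_repo/spdk
    readelf -d build/lib/libspdk_log.so | grep SONAME   # should match the version in the SO line above
    ls -l build/lib/libspdk_log.so*                     # unversioned symlink -> versioned object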
00:03:52.926 SO libspdk_virtio.so.7.0 00:03:52.926 SYMLINK libspdk_fsdev.so 00:03:52.926 SYMLINK libspdk_virtio.so 00:03:52.926 CC lib/bdev/bdev_rpc.o 00:03:52.926 CC lib/bdev/bdev_zone.o 00:03:52.926 CC lib/bdev/part.o 00:03:52.926 LIB libspdk_nvme.a 00:03:52.926 CC lib/fuse_dispatcher/fuse_dispatcher.o 00:03:53.187 CC lib/bdev/scsi_nvme.o 00:03:53.187 SO libspdk_nvme.so.15.0 00:03:53.187 LIB libspdk_event.a 00:03:53.187 SO libspdk_event.so.14.0 00:03:53.447 SYMLINK libspdk_event.so 00:03:53.447 SYMLINK libspdk_nvme.so 00:03:53.709 LIB libspdk_fuse_dispatcher.a 00:03:53.709 SO libspdk_fuse_dispatcher.so.1.0 00:03:53.709 SYMLINK libspdk_fuse_dispatcher.so 00:03:54.650 LIB libspdk_blob.a 00:03:54.650 SO libspdk_blob.so.12.0 00:03:54.650 SYMLINK libspdk_blob.so 00:03:54.911 CC lib/blobfs/blobfs.o 00:03:54.911 CC lib/blobfs/tree.o 00:03:54.911 CC lib/lvol/lvol.o 00:03:55.170 LIB libspdk_bdev.a 00:03:55.170 SO libspdk_bdev.so.17.0 00:03:55.170 SYMLINK libspdk_bdev.so 00:03:55.431 CC lib/scsi/dev.o 00:03:55.431 CC lib/scsi/scsi.o 00:03:55.431 CC lib/scsi/port.o 00:03:55.431 CC lib/scsi/lun.o 00:03:55.431 CC lib/ublk/ublk.o 00:03:55.431 CC lib/ftl/ftl_core.o 00:03:55.431 CC lib/nbd/nbd.o 00:03:55.431 CC lib/nvmf/ctrlr.o 00:03:55.431 CC lib/nvmf/ctrlr_discovery.o 00:03:55.431 CC lib/nvmf/ctrlr_bdev.o 00:03:55.691 CC lib/nvmf/subsystem.o 00:03:55.691 LIB libspdk_lvol.a 00:03:55.691 SO libspdk_lvol.so.11.0 00:03:55.691 CC lib/scsi/scsi_bdev.o 00:03:55.691 SYMLINK libspdk_lvol.so 00:03:55.691 CC lib/scsi/scsi_pr.o 00:03:55.691 LIB libspdk_blobfs.a 00:03:55.691 SO libspdk_blobfs.so.11.0 00:03:55.691 CC lib/ftl/ftl_init.o 00:03:55.691 CC lib/nbd/nbd_rpc.o 00:03:55.951 SYMLINK libspdk_blobfs.so 00:03:55.951 CC lib/ftl/ftl_layout.o 00:03:55.951 CC lib/ftl/ftl_debug.o 00:03:55.951 CC lib/ublk/ublk_rpc.o 00:03:55.951 LIB libspdk_nbd.a 00:03:55.951 SO libspdk_nbd.so.7.0 00:03:55.951 CC lib/scsi/scsi_rpc.o 00:03:55.951 CC lib/scsi/task.o 00:03:55.951 SYMLINK libspdk_nbd.so 00:03:55.951 CC lib/ftl/ftl_io.o 00:03:55.951 CC lib/ftl/ftl_sb.o 00:03:56.212 CC lib/ftl/ftl_l2p.o 00:03:56.212 LIB libspdk_ublk.a 00:03:56.212 SO libspdk_ublk.so.3.0 00:03:56.212 CC lib/ftl/ftl_l2p_flat.o 00:03:56.212 CC lib/nvmf/nvmf.o 00:03:56.212 SYMLINK libspdk_ublk.so 00:03:56.212 CC lib/nvmf/nvmf_rpc.o 00:03:56.212 CC lib/nvmf/transport.o 00:03:56.212 LIB libspdk_scsi.a 00:03:56.212 CC lib/nvmf/tcp.o 00:03:56.212 CC lib/ftl/ftl_nv_cache.o 00:03:56.212 SO libspdk_scsi.so.9.0 00:03:56.212 CC lib/ftl/ftl_band.o 00:03:56.473 CC lib/ftl/ftl_band_ops.o 00:03:56.473 SYMLINK libspdk_scsi.so 00:03:56.473 CC lib/ftl/ftl_writer.o 00:03:56.473 CC lib/ftl/ftl_rq.o 00:03:56.735 CC lib/iscsi/conn.o 00:03:56.735 CC lib/iscsi/init_grp.o 00:03:56.735 CC lib/iscsi/iscsi.o 00:03:56.735 CC lib/iscsi/param.o 00:03:56.997 CC lib/iscsi/portal_grp.o 00:03:56.997 CC lib/ftl/ftl_reloc.o 00:03:56.997 CC lib/vhost/vhost.o 00:03:56.997 CC lib/vhost/vhost_rpc.o 00:03:56.997 CC lib/vhost/vhost_scsi.o 00:03:57.259 CC lib/iscsi/tgt_node.o 00:03:57.259 CC lib/iscsi/iscsi_subsystem.o 00:03:57.259 CC lib/iscsi/iscsi_rpc.o 00:03:57.259 CC lib/nvmf/stubs.o 00:03:57.259 CC lib/ftl/ftl_l2p_cache.o 00:03:57.520 CC lib/iscsi/task.o 00:03:57.520 CC lib/nvmf/mdns_server.o 00:03:57.780 CC lib/nvmf/rdma.o 00:03:57.780 CC lib/nvmf/auth.o 00:03:57.780 CC lib/vhost/vhost_blk.o 00:03:57.780 CC lib/vhost/rte_vhost_user.o 00:03:57.780 CC lib/ftl/ftl_p2l.o 00:03:57.780 CC lib/ftl/ftl_p2l_log.o 00:03:58.041 CC lib/ftl/mngt/ftl_mngt.o 00:03:58.041 CC lib/ftl/mngt/ftl_mngt_bdev.o 
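Higher-level libraries such as bdev link against the ones built before them; a similarly hypothetical check (same assumed output directory) that those inter-library references resolve:

    # Hypothetical; lists the libspdk_* libraries this object was linked against.
    LD_LIBRARY_PATH=/home/vagrant/spdk_repo/spdk/build/lib \
        ldd /home/vagrant/spdk_repo/spdk/build/lib/libspdk_bdev.so | grep libspdk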
00:03:58.041 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:58.041 LIB libspdk_iscsi.a 00:03:58.041 SO libspdk_iscsi.so.8.0 00:03:58.041 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:58.041 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:58.041 SYMLINK libspdk_iscsi.so 00:03:58.041 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:58.041 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:58.041 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:58.300 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:58.300 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:58.300 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:58.300 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:58.300 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:58.300 CC lib/ftl/utils/ftl_conf.o 00:03:58.300 CC lib/ftl/utils/ftl_md.o 00:03:58.560 CC lib/ftl/utils/ftl_mempool.o 00:03:58.560 CC lib/ftl/utils/ftl_bitmap.o 00:03:58.560 CC lib/ftl/utils/ftl_property.o 00:03:58.560 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:58.560 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:58.560 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:58.560 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:58.560 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:58.560 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:58.560 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:58.560 CC lib/ftl/upgrade/ftl_sb_v3.o 00:03:58.819 LIB libspdk_vhost.a 00:03:58.819 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:58.819 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:58.819 CC lib/ftl/nvc/ftl_nvc_bdev_non_vss.o 00:03:58.819 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:58.819 CC lib/ftl/nvc/ftl_nvc_bdev_common.o 00:03:58.819 SO libspdk_vhost.so.8.0 00:03:58.819 CC lib/ftl/base/ftl_base_dev.o 00:03:58.819 CC lib/ftl/base/ftl_base_bdev.o 00:03:58.819 SYMLINK libspdk_vhost.so 00:03:58.819 CC lib/ftl/ftl_trace.o 00:03:59.077 LIB libspdk_ftl.a 00:03:59.077 SO libspdk_ftl.so.9.0 00:03:59.336 SYMLINK libspdk_ftl.so 00:03:59.596 LIB libspdk_nvmf.a 00:03:59.855 SO libspdk_nvmf.so.20.0 00:03:59.855 SYMLINK libspdk_nvmf.so 00:04:00.114 CC module/env_dpdk/env_dpdk_rpc.o 00:04:00.372 CC module/blob/bdev/blob_bdev.o 00:04:00.372 CC module/scheduler/dynamic/scheduler_dynamic.o 00:04:00.372 CC module/sock/posix/posix.o 00:04:00.372 CC module/keyring/file/keyring.o 00:04:00.372 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:04:00.372 CC module/scheduler/gscheduler/gscheduler.o 00:04:00.372 CC module/keyring/linux/keyring.o 00:04:00.372 CC module/fsdev/aio/fsdev_aio.o 00:04:00.372 CC module/accel/error/accel_error.o 00:04:00.372 LIB libspdk_env_dpdk_rpc.a 00:04:00.372 SO libspdk_env_dpdk_rpc.so.6.0 00:04:00.372 LIB libspdk_scheduler_dpdk_governor.a 00:04:00.372 CC module/keyring/linux/keyring_rpc.o 00:04:00.372 SO libspdk_scheduler_dpdk_governor.so.4.0 00:04:00.372 CC module/keyring/file/keyring_rpc.o 00:04:00.372 LIB libspdk_scheduler_dynamic.a 00:04:00.372 SYMLINK libspdk_env_dpdk_rpc.so 00:04:00.372 LIB libspdk_scheduler_gscheduler.a 00:04:00.372 CC module/fsdev/aio/fsdev_aio_rpc.o 00:04:00.372 SO libspdk_scheduler_gscheduler.so.4.0 00:04:00.373 SO libspdk_scheduler_dynamic.so.4.0 00:04:00.373 SYMLINK libspdk_scheduler_dpdk_governor.so 00:04:00.373 SYMLINK libspdk_scheduler_dynamic.so 00:04:00.373 SYMLINK libspdk_scheduler_gscheduler.so 00:04:00.631 CC module/fsdev/aio/linux_aio_mgr.o 00:04:00.631 CC module/accel/error/accel_error_rpc.o 00:04:00.631 LIB libspdk_keyring_linux.a 00:04:00.631 SO libspdk_keyring_linux.so.1.0 00:04:00.631 LIB libspdk_keyring_file.a 00:04:00.631 LIB libspdk_blob_bdev.a 00:04:00.631 SO libspdk_keyring_file.so.2.0 00:04:00.631 SO libspdk_blob_bdev.so.12.0 00:04:00.631 SYMLINK libspdk_keyring_linux.so 
00:04:00.631 SYMLINK libspdk_keyring_file.so 00:04:00.631 SYMLINK libspdk_blob_bdev.so 00:04:00.631 LIB libspdk_accel_error.a 00:04:00.631 CC module/accel/dsa/accel_dsa.o 00:04:00.631 CC module/accel/dsa/accel_dsa_rpc.o 00:04:00.631 CC module/accel/ioat/accel_ioat.o 00:04:00.631 SO libspdk_accel_error.so.2.0 00:04:00.631 CC module/accel/ioat/accel_ioat_rpc.o 00:04:00.631 CC module/accel/iaa/accel_iaa.o 00:04:00.631 SYMLINK libspdk_accel_error.so 00:04:00.631 CC module/accel/iaa/accel_iaa_rpc.o 00:04:00.890 LIB libspdk_accel_ioat.a 00:04:00.890 SO libspdk_accel_ioat.so.6.0 00:04:00.890 CC module/bdev/delay/vbdev_delay.o 00:04:00.890 CC module/blobfs/bdev/blobfs_bdev.o 00:04:00.890 LIB libspdk_accel_dsa.a 00:04:00.890 SO libspdk_accel_dsa.so.5.0 00:04:00.890 LIB libspdk_accel_iaa.a 00:04:00.890 CC module/bdev/error/vbdev_error.o 00:04:00.890 SYMLINK libspdk_accel_ioat.so 00:04:00.890 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:04:00.890 SO libspdk_accel_iaa.so.3.0 00:04:00.890 SYMLINK libspdk_accel_dsa.so 00:04:00.890 CC module/bdev/error/vbdev_error_rpc.o 00:04:00.890 CC module/bdev/gpt/gpt.o 00:04:00.890 LIB libspdk_sock_posix.a 00:04:00.890 SYMLINK libspdk_accel_iaa.so 00:04:00.890 CC module/bdev/delay/vbdev_delay_rpc.o 00:04:00.890 LIB libspdk_fsdev_aio.a 00:04:00.890 SO libspdk_sock_posix.so.6.0 00:04:00.890 CC module/bdev/lvol/vbdev_lvol.o 00:04:01.148 SO libspdk_fsdev_aio.so.1.0 00:04:01.148 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:04:01.148 SYMLINK libspdk_sock_posix.so 00:04:01.148 SYMLINK libspdk_fsdev_aio.so 00:04:01.148 CC module/bdev/gpt/vbdev_gpt.o 00:04:01.148 LIB libspdk_blobfs_bdev.a 00:04:01.148 SO libspdk_blobfs_bdev.so.6.0 00:04:01.148 LIB libspdk_bdev_error.a 00:04:01.148 SYMLINK libspdk_blobfs_bdev.so 00:04:01.148 SO libspdk_bdev_error.so.6.0 00:04:01.148 CC module/bdev/malloc/bdev_malloc.o 00:04:01.149 CC module/bdev/null/bdev_null.o 00:04:01.149 LIB libspdk_bdev_delay.a 00:04:01.149 CC module/bdev/nvme/bdev_nvme.o 00:04:01.149 SO libspdk_bdev_delay.so.6.0 00:04:01.149 SYMLINK libspdk_bdev_error.so 00:04:01.149 CC module/bdev/passthru/vbdev_passthru.o 00:04:01.407 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:04:01.407 SYMLINK libspdk_bdev_delay.so 00:04:01.407 CC module/bdev/malloc/bdev_malloc_rpc.o 00:04:01.407 CC module/bdev/raid/bdev_raid.o 00:04:01.407 LIB libspdk_bdev_gpt.a 00:04:01.407 SO libspdk_bdev_gpt.so.6.0 00:04:01.407 CC module/bdev/raid/bdev_raid_rpc.o 00:04:01.407 SYMLINK libspdk_bdev_gpt.so 00:04:01.407 CC module/bdev/raid/bdev_raid_sb.o 00:04:01.407 CC module/bdev/nvme/bdev_nvme_rpc.o 00:04:01.407 LIB libspdk_bdev_lvol.a 00:04:01.407 CC module/bdev/null/bdev_null_rpc.o 00:04:01.407 SO libspdk_bdev_lvol.so.6.0 00:04:01.407 CC module/bdev/nvme/nvme_rpc.o 00:04:01.407 SYMLINK libspdk_bdev_lvol.so 00:04:01.407 LIB libspdk_bdev_malloc.a 00:04:01.665 SO libspdk_bdev_malloc.so.6.0 00:04:01.665 CC module/bdev/raid/raid0.o 00:04:01.665 LIB libspdk_bdev_null.a 00:04:01.665 LIB libspdk_bdev_passthru.a 00:04:01.665 SO libspdk_bdev_null.so.6.0 00:04:01.665 SO libspdk_bdev_passthru.so.6.0 00:04:01.665 SYMLINK libspdk_bdev_malloc.so 00:04:01.665 CC module/bdev/raid/raid1.o 00:04:01.665 SYMLINK libspdk_bdev_passthru.so 00:04:01.665 CC module/bdev/raid/concat.o 00:04:01.665 SYMLINK libspdk_bdev_null.so 00:04:01.665 CC module/bdev/nvme/bdev_mdns_client.o 00:04:01.665 CC module/bdev/split/vbdev_split.o 00:04:01.665 CC module/bdev/split/vbdev_split_rpc.o 00:04:01.665 CC module/bdev/nvme/vbdev_opal.o 00:04:01.665 CC module/bdev/nvme/vbdev_opal_rpc.o 00:04:01.924 CC 
module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:04:01.924 LIB libspdk_bdev_split.a 00:04:01.924 SO libspdk_bdev_split.so.6.0 00:04:01.924 SYMLINK libspdk_bdev_split.so 00:04:01.924 CC module/bdev/xnvme/bdev_xnvme.o 00:04:01.924 CC module/bdev/xnvme/bdev_xnvme_rpc.o 00:04:01.924 CC module/bdev/zone_block/vbdev_zone_block.o 00:04:01.924 CC module/bdev/aio/bdev_aio.o 00:04:02.182 CC module/bdev/ftl/bdev_ftl.o 00:04:02.182 CC module/bdev/aio/bdev_aio_rpc.o 00:04:02.182 CC module/bdev/virtio/bdev_virtio_scsi.o 00:04:02.182 CC module/bdev/iscsi/bdev_iscsi.o 00:04:02.182 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:04:02.182 LIB libspdk_bdev_raid.a 00:04:02.182 SO libspdk_bdev_raid.so.6.0 00:04:02.182 CC module/bdev/ftl/bdev_ftl_rpc.o 00:04:02.182 LIB libspdk_bdev_xnvme.a 00:04:02.182 SO libspdk_bdev_xnvme.so.3.0 00:04:02.182 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:04:02.182 CC module/bdev/virtio/bdev_virtio_blk.o 00:04:02.182 LIB libspdk_bdev_aio.a 00:04:02.182 SYMLINK libspdk_bdev_raid.so 00:04:02.182 CC module/bdev/virtio/bdev_virtio_rpc.o 00:04:02.182 SYMLINK libspdk_bdev_xnvme.so 00:04:02.182 SO libspdk_bdev_aio.so.6.0 00:04:02.465 SYMLINK libspdk_bdev_aio.so 00:04:02.465 LIB libspdk_bdev_zone_block.a 00:04:02.465 SO libspdk_bdev_zone_block.so.6.0 00:04:02.465 LIB libspdk_bdev_ftl.a 00:04:02.465 LIB libspdk_bdev_iscsi.a 00:04:02.465 SO libspdk_bdev_ftl.so.6.0 00:04:02.465 SYMLINK libspdk_bdev_zone_block.so 00:04:02.465 SO libspdk_bdev_iscsi.so.6.0 00:04:02.465 SYMLINK libspdk_bdev_ftl.so 00:04:02.465 SYMLINK libspdk_bdev_iscsi.so 00:04:02.465 LIB libspdk_bdev_virtio.a 00:04:02.465 SO libspdk_bdev_virtio.so.6.0 00:04:02.759 SYMLINK libspdk_bdev_virtio.so 00:04:03.324 LIB libspdk_bdev_nvme.a 00:04:03.582 SO libspdk_bdev_nvme.so.7.1 00:04:03.582 SYMLINK libspdk_bdev_nvme.so 00:04:04.148 CC module/event/subsystems/iobuf/iobuf.o 00:04:04.148 CC module/event/subsystems/sock/sock.o 00:04:04.148 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:04:04.148 CC module/event/subsystems/vmd/vmd.o 00:04:04.148 CC module/event/subsystems/vmd/vmd_rpc.o 00:04:04.148 CC module/event/subsystems/keyring/keyring.o 00:04:04.148 CC module/event/subsystems/fsdev/fsdev.o 00:04:04.148 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:04:04.148 CC module/event/subsystems/scheduler/scheduler.o 00:04:04.148 LIB libspdk_event_keyring.a 00:04:04.148 LIB libspdk_event_fsdev.a 00:04:04.148 LIB libspdk_event_vmd.a 00:04:04.148 LIB libspdk_event_sock.a 00:04:04.148 LIB libspdk_event_scheduler.a 00:04:04.148 LIB libspdk_event_vhost_blk.a 00:04:04.148 LIB libspdk_event_iobuf.a 00:04:04.148 SO libspdk_event_keyring.so.1.0 00:04:04.148 SO libspdk_event_sock.so.5.0 00:04:04.148 SO libspdk_event_fsdev.so.1.0 00:04:04.148 SO libspdk_event_vmd.so.6.0 00:04:04.148 SO libspdk_event_scheduler.so.4.0 00:04:04.148 SO libspdk_event_vhost_blk.so.3.0 00:04:04.148 SO libspdk_event_iobuf.so.3.0 00:04:04.148 SYMLINK libspdk_event_keyring.so 00:04:04.148 SYMLINK libspdk_event_sock.so 00:04:04.148 SYMLINK libspdk_event_fsdev.so 00:04:04.148 SYMLINK libspdk_event_vhost_blk.so 00:04:04.148 SYMLINK libspdk_event_scheduler.so 00:04:04.148 SYMLINK libspdk_event_vmd.so 00:04:04.148 SYMLINK libspdk_event_iobuf.so 00:04:04.406 CC module/event/subsystems/accel/accel.o 00:04:04.664 LIB libspdk_event_accel.a 00:04:04.664 SO libspdk_event_accel.so.6.0 00:04:04.664 SYMLINK libspdk_event_accel.so 00:04:04.922 CC module/event/subsystems/bdev/bdev.o 00:04:04.922 LIB libspdk_event_bdev.a 00:04:05.183 SO libspdk_event_bdev.so.6.0 00:04:05.183 SYMLINK 
libspdk_event_bdev.so 00:04:05.183 CC module/event/subsystems/scsi/scsi.o 00:04:05.183 CC module/event/subsystems/nbd/nbd.o 00:04:05.183 CC module/event/subsystems/ublk/ublk.o 00:04:05.183 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:04:05.183 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:04:05.445 LIB libspdk_event_nbd.a 00:04:05.445 LIB libspdk_event_ublk.a 00:04:05.445 SO libspdk_event_ublk.so.3.0 00:04:05.445 SO libspdk_event_nbd.so.6.0 00:04:05.445 LIB libspdk_event_scsi.a 00:04:05.445 SO libspdk_event_scsi.so.6.0 00:04:05.445 SYMLINK libspdk_event_nbd.so 00:04:05.445 SYMLINK libspdk_event_ublk.so 00:04:05.445 SYMLINK libspdk_event_scsi.so 00:04:05.445 LIB libspdk_event_nvmf.a 00:04:05.445 SO libspdk_event_nvmf.so.6.0 00:04:05.445 SYMLINK libspdk_event_nvmf.so 00:04:05.702 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:04:05.702 CC module/event/subsystems/iscsi/iscsi.o 00:04:05.702 LIB libspdk_event_vhost_scsi.a 00:04:05.702 LIB libspdk_event_iscsi.a 00:04:05.702 SO libspdk_event_vhost_scsi.so.3.0 00:04:05.959 SO libspdk_event_iscsi.so.6.0 00:04:05.959 SYMLINK libspdk_event_vhost_scsi.so 00:04:05.959 SYMLINK libspdk_event_iscsi.so 00:04:05.959 SO libspdk.so.6.0 00:04:05.959 SYMLINK libspdk.so 00:04:06.217 CC app/trace_record/trace_record.o 00:04:06.217 CXX app/trace/trace.o 00:04:06.217 CC app/spdk_lspci/spdk_lspci.o 00:04:06.217 CC examples/interrupt_tgt/interrupt_tgt.o 00:04:06.217 CC app/iscsi_tgt/iscsi_tgt.o 00:04:06.217 CC app/nvmf_tgt/nvmf_main.o 00:04:06.217 CC examples/util/zipf/zipf.o 00:04:06.217 CC app/spdk_tgt/spdk_tgt.o 00:04:06.217 CC examples/ioat/perf/perf.o 00:04:06.217 CC test/thread/poller_perf/poller_perf.o 00:04:06.217 LINK spdk_lspci 00:04:06.475 LINK iscsi_tgt 00:04:06.475 LINK nvmf_tgt 00:04:06.475 LINK interrupt_tgt 00:04:06.475 LINK poller_perf 00:04:06.475 LINK zipf 00:04:06.475 LINK spdk_trace_record 00:04:06.475 LINK spdk_tgt 00:04:06.475 LINK ioat_perf 00:04:06.475 LINK spdk_trace 00:04:06.475 CC app/spdk_nvme_perf/perf.o 00:04:06.475 CC app/spdk_nvme_identify/identify.o 00:04:06.733 CC app/spdk_nvme_discover/discovery_aer.o 00:04:06.733 CC examples/ioat/verify/verify.o 00:04:06.733 TEST_HEADER include/spdk/accel.h 00:04:06.733 TEST_HEADER include/spdk/accel_module.h 00:04:06.733 TEST_HEADER include/spdk/assert.h 00:04:06.733 TEST_HEADER include/spdk/barrier.h 00:04:06.733 TEST_HEADER include/spdk/base64.h 00:04:06.733 TEST_HEADER include/spdk/bdev.h 00:04:06.733 TEST_HEADER include/spdk/bdev_module.h 00:04:06.733 TEST_HEADER include/spdk/bdev_zone.h 00:04:06.733 TEST_HEADER include/spdk/bit_array.h 00:04:06.733 TEST_HEADER include/spdk/bit_pool.h 00:04:06.733 TEST_HEADER include/spdk/blob_bdev.h 00:04:06.733 TEST_HEADER include/spdk/blobfs_bdev.h 00:04:06.733 TEST_HEADER include/spdk/blobfs.h 00:04:06.733 TEST_HEADER include/spdk/blob.h 00:04:06.733 TEST_HEADER include/spdk/conf.h 00:04:06.733 TEST_HEADER include/spdk/config.h 00:04:06.733 TEST_HEADER include/spdk/cpuset.h 00:04:06.733 TEST_HEADER include/spdk/crc16.h 00:04:06.733 TEST_HEADER include/spdk/crc32.h 00:04:06.733 TEST_HEADER include/spdk/crc64.h 00:04:06.733 TEST_HEADER include/spdk/dif.h 00:04:06.733 TEST_HEADER include/spdk/dma.h 00:04:06.733 TEST_HEADER include/spdk/endian.h 00:04:06.733 TEST_HEADER include/spdk/env_dpdk.h 00:04:06.733 TEST_HEADER include/spdk/env.h 00:04:06.733 TEST_HEADER include/spdk/event.h 00:04:06.733 TEST_HEADER include/spdk/fd_group.h 00:04:06.733 TEST_HEADER include/spdk/fd.h 00:04:06.733 TEST_HEADER include/spdk/file.h 00:04:06.733 TEST_HEADER 
include/spdk/fsdev.h 00:04:06.733 TEST_HEADER include/spdk/fsdev_module.h 00:04:06.733 TEST_HEADER include/spdk/ftl.h 00:04:06.733 TEST_HEADER include/spdk/gpt_spec.h 00:04:06.733 TEST_HEADER include/spdk/hexlify.h 00:04:06.733 TEST_HEADER include/spdk/histogram_data.h 00:04:06.733 CC test/dma/test_dma/test_dma.o 00:04:06.733 TEST_HEADER include/spdk/idxd.h 00:04:06.733 TEST_HEADER include/spdk/idxd_spec.h 00:04:06.733 TEST_HEADER include/spdk/init.h 00:04:06.733 TEST_HEADER include/spdk/ioat.h 00:04:06.734 CC examples/thread/thread/thread_ex.o 00:04:06.734 TEST_HEADER include/spdk/ioat_spec.h 00:04:06.734 TEST_HEADER include/spdk/iscsi_spec.h 00:04:06.734 TEST_HEADER include/spdk/json.h 00:04:06.734 TEST_HEADER include/spdk/jsonrpc.h 00:04:06.734 TEST_HEADER include/spdk/keyring.h 00:04:06.734 TEST_HEADER include/spdk/keyring_module.h 00:04:06.734 TEST_HEADER include/spdk/likely.h 00:04:06.734 TEST_HEADER include/spdk/log.h 00:04:06.734 TEST_HEADER include/spdk/lvol.h 00:04:06.734 TEST_HEADER include/spdk/md5.h 00:04:06.734 TEST_HEADER include/spdk/memory.h 00:04:06.734 TEST_HEADER include/spdk/mmio.h 00:04:06.734 TEST_HEADER include/spdk/nbd.h 00:04:06.734 TEST_HEADER include/spdk/net.h 00:04:06.734 TEST_HEADER include/spdk/notify.h 00:04:06.734 CC test/app/bdev_svc/bdev_svc.o 00:04:06.734 TEST_HEADER include/spdk/nvme.h 00:04:06.734 TEST_HEADER include/spdk/nvme_intel.h 00:04:06.734 TEST_HEADER include/spdk/nvme_ocssd.h 00:04:06.734 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:04:06.734 TEST_HEADER include/spdk/nvme_spec.h 00:04:06.734 TEST_HEADER include/spdk/nvme_zns.h 00:04:06.734 TEST_HEADER include/spdk/nvmf_cmd.h 00:04:06.734 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:04:06.734 TEST_HEADER include/spdk/nvmf.h 00:04:06.734 TEST_HEADER include/spdk/nvmf_spec.h 00:04:06.734 TEST_HEADER include/spdk/nvmf_transport.h 00:04:06.734 TEST_HEADER include/spdk/opal.h 00:04:06.734 TEST_HEADER include/spdk/opal_spec.h 00:04:06.734 TEST_HEADER include/spdk/pci_ids.h 00:04:06.734 TEST_HEADER include/spdk/pipe.h 00:04:06.734 TEST_HEADER include/spdk/queue.h 00:04:06.734 TEST_HEADER include/spdk/reduce.h 00:04:06.734 TEST_HEADER include/spdk/rpc.h 00:04:06.734 TEST_HEADER include/spdk/scheduler.h 00:04:06.734 TEST_HEADER include/spdk/scsi.h 00:04:06.734 TEST_HEADER include/spdk/scsi_spec.h 00:04:06.734 TEST_HEADER include/spdk/sock.h 00:04:06.734 CC examples/sock/hello_world/hello_sock.o 00:04:06.734 TEST_HEADER include/spdk/stdinc.h 00:04:06.734 TEST_HEADER include/spdk/string.h 00:04:06.734 TEST_HEADER include/spdk/thread.h 00:04:06.734 TEST_HEADER include/spdk/trace.h 00:04:06.734 TEST_HEADER include/spdk/trace_parser.h 00:04:06.734 LINK verify 00:04:06.734 TEST_HEADER include/spdk/tree.h 00:04:06.734 TEST_HEADER include/spdk/ublk.h 00:04:06.734 TEST_HEADER include/spdk/util.h 00:04:06.734 TEST_HEADER include/spdk/uuid.h 00:04:06.734 TEST_HEADER include/spdk/version.h 00:04:06.734 TEST_HEADER include/spdk/vfio_user_pci.h 00:04:06.734 TEST_HEADER include/spdk/vfio_user_spec.h 00:04:06.734 TEST_HEADER include/spdk/vhost.h 00:04:06.734 TEST_HEADER include/spdk/vmd.h 00:04:06.734 TEST_HEADER include/spdk/xor.h 00:04:06.734 LINK spdk_nvme_discover 00:04:06.734 TEST_HEADER include/spdk/zipf.h 00:04:06.734 CXX test/cpp_headers/accel.o 00:04:06.992 CXX test/cpp_headers/accel_module.o 00:04:06.992 LINK thread 00:04:06.992 CC test/env/mem_callbacks/mem_callbacks.o 00:04:06.992 LINK bdev_svc 00:04:06.992 CXX test/cpp_headers/assert.o 00:04:06.992 LINK hello_sock 00:04:06.992 CC test/env/vtophys/vtophys.o 
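The TEST_HEADER records above, together with the CXX test/cpp_headers/*.o compiles that continue below, are a header self-containment pass: every public spdk header is compiled in a translation unit of its own, so a header that silently depends on another include fails here rather than in user code. A rough stand-alone equivalent (the stub paths and compiler invocation are assumptions, not the harness SPDK actually generates):

  # Compile each public header in isolation; a header that is not
  # self-contained breaks the build of its own one-line stub.
  for h in include/spdk/*.h; do
    stub="/tmp/hdr_$(basename "$h" .h).cpp"
    printf '#include "spdk/%s"\n' "$(basename "$h")" > "$stub"
    g++ -Iinclude -c "$stub" -o "${stub%.cpp}.o" || echo "not self-contained: $h"
  done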
00:04:06.992 CXX test/cpp_headers/barrier.o 00:04:07.249 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:04:07.249 LINK test_dma 00:04:07.249 LINK vtophys 00:04:07.249 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:04:07.249 CC test/event/event_perf/event_perf.o 00:04:07.249 CXX test/cpp_headers/base64.o 00:04:07.249 LINK env_dpdk_post_init 00:04:07.249 LINK spdk_nvme_identify 00:04:07.249 CC examples/vmd/lsvmd/lsvmd.o 00:04:07.249 LINK event_perf 00:04:07.507 CXX test/cpp_headers/bdev.o 00:04:07.507 CC test/event/reactor/reactor.o 00:04:07.507 CC test/event/reactor_perf/reactor_perf.o 00:04:07.507 LINK spdk_nvme_perf 00:04:07.507 LINK lsvmd 00:04:07.507 LINK mem_callbacks 00:04:07.507 CC test/event/app_repeat/app_repeat.o 00:04:07.507 LINK reactor 00:04:07.507 CC test/rpc_client/rpc_client_test.o 00:04:07.507 CXX test/cpp_headers/bdev_module.o 00:04:07.507 LINK nvme_fuzz 00:04:07.507 LINK reactor_perf 00:04:07.507 CC test/event/scheduler/scheduler.o 00:04:07.507 CC test/env/memory/memory_ut.o 00:04:07.507 CC app/spdk_top/spdk_top.o 00:04:07.507 CXX test/cpp_headers/bdev_zone.o 00:04:07.507 CC examples/vmd/led/led.o 00:04:07.507 LINK app_repeat 00:04:07.764 LINK rpc_client_test 00:04:07.764 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:04:07.764 CXX test/cpp_headers/bit_array.o 00:04:07.764 CXX test/cpp_headers/bit_pool.o 00:04:07.764 LINK led 00:04:07.764 CC test/app/histogram_perf/histogram_perf.o 00:04:07.764 LINK scheduler 00:04:07.764 CXX test/cpp_headers/blob_bdev.o 00:04:07.764 CC test/accel/dif/dif.o 00:04:07.764 LINK histogram_perf 00:04:08.021 CXX test/cpp_headers/blobfs_bdev.o 00:04:08.021 CC test/app/jsoncat/jsoncat.o 00:04:08.021 CC test/app/stub/stub.o 00:04:08.021 CC examples/idxd/perf/perf.o 00:04:08.021 LINK jsoncat 00:04:08.021 CXX test/cpp_headers/blobfs.o 00:04:08.021 LINK stub 00:04:08.021 CC examples/fsdev/hello_world/hello_fsdev.o 00:04:08.021 CC examples/accel/perf/accel_perf.o 00:04:08.279 CXX test/cpp_headers/blob.o 00:04:08.279 LINK idxd_perf 00:04:08.279 CC examples/blob/hello_world/hello_blob.o 00:04:08.279 CXX test/cpp_headers/conf.o 00:04:08.279 CC examples/nvme/hello_world/hello_world.o 00:04:08.279 LINK hello_fsdev 00:04:08.279 LINK dif 00:04:08.537 CXX test/cpp_headers/config.o 00:04:08.537 LINK memory_ut 00:04:08.537 CXX test/cpp_headers/cpuset.o 00:04:08.537 LINK spdk_top 00:04:08.537 LINK hello_world 00:04:08.537 LINK hello_blob 00:04:08.537 CC test/blobfs/mkfs/mkfs.o 00:04:08.537 CXX test/cpp_headers/crc16.o 00:04:08.537 LINK accel_perf 00:04:08.537 CC test/env/pci/pci_ut.o 00:04:08.796 CC test/nvme/aer/aer.o 00:04:08.796 CC app/vhost/vhost.o 00:04:08.796 CC test/lvol/esnap/esnap.o 00:04:08.796 CXX test/cpp_headers/crc32.o 00:04:08.796 CC examples/nvme/reconnect/reconnect.o 00:04:08.796 LINK mkfs 00:04:08.796 CXX test/cpp_headers/crc64.o 00:04:08.796 CC examples/blob/cli/blobcli.o 00:04:08.796 LINK vhost 00:04:08.796 CXX test/cpp_headers/dif.o 00:04:08.796 CXX test/cpp_headers/dma.o 00:04:09.054 LINK aer 00:04:09.054 LINK pci_ut 00:04:09.054 CXX test/cpp_headers/endian.o 00:04:09.054 CC examples/bdev/hello_world/hello_bdev.o 00:04:09.054 LINK reconnect 00:04:09.054 CC test/nvme/reset/reset.o 00:04:09.054 CC app/spdk_dd/spdk_dd.o 00:04:09.054 LINK iscsi_fuzz 00:04:09.054 CXX test/cpp_headers/env_dpdk.o 00:04:09.054 CXX test/cpp_headers/env.o 00:04:09.054 CC app/fio/nvme/fio_plugin.o 00:04:09.054 LINK blobcli 00:04:09.313 LINK hello_bdev 00:04:09.313 CC examples/nvme/nvme_manage/nvme_manage.o 00:04:09.313 CXX test/cpp_headers/event.o 00:04:09.313 CXX 
test/cpp_headers/fd_group.o 00:04:09.313 LINK reset 00:04:09.313 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:04:09.313 LINK spdk_dd 00:04:09.313 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:04:09.313 CC test/bdev/bdevio/bdevio.o 00:04:09.313 CXX test/cpp_headers/fd.o 00:04:09.571 CXX test/cpp_headers/file.o 00:04:09.571 CC examples/bdev/bdevperf/bdevperf.o 00:04:09.571 CC test/nvme/sgl/sgl.o 00:04:09.571 CXX test/cpp_headers/fsdev.o 00:04:09.571 CC test/nvme/e2edp/nvme_dp.o 00:04:09.571 CC examples/nvme/arbitration/arbitration.o 00:04:09.571 LINK spdk_nvme 00:04:09.571 LINK nvme_manage 00:04:09.572 CXX test/cpp_headers/fsdev_module.o 00:04:09.829 LINK vhost_fuzz 00:04:09.829 LINK sgl 00:04:09.829 CC app/fio/bdev/fio_plugin.o 00:04:09.829 CXX test/cpp_headers/ftl.o 00:04:09.829 LINK bdevio 00:04:09.829 CC examples/nvme/hotplug/hotplug.o 00:04:09.829 LINK nvme_dp 00:04:09.829 LINK arbitration 00:04:09.829 CC test/nvme/err_injection/err_injection.o 00:04:09.829 CC test/nvme/overhead/overhead.o 00:04:09.829 CXX test/cpp_headers/gpt_spec.o 00:04:10.088 LINK hotplug 00:04:10.088 CC examples/nvme/cmb_copy/cmb_copy.o 00:04:10.088 CC test/nvme/startup/startup.o 00:04:10.088 LINK err_injection 00:04:10.088 CC examples/nvme/abort/abort.o 00:04:10.088 CXX test/cpp_headers/hexlify.o 00:04:10.088 CXX test/cpp_headers/histogram_data.o 00:04:10.088 CXX test/cpp_headers/idxd.o 00:04:10.088 LINK bdevperf 00:04:10.088 LINK spdk_bdev 00:04:10.088 LINK startup 00:04:10.088 LINK overhead 00:04:10.088 LINK cmb_copy 00:04:10.088 CXX test/cpp_headers/idxd_spec.o 00:04:10.346 CXX test/cpp_headers/init.o 00:04:10.346 CXX test/cpp_headers/ioat.o 00:04:10.346 CXX test/cpp_headers/ioat_spec.o 00:04:10.346 CXX test/cpp_headers/iscsi_spec.o 00:04:10.346 LINK abort 00:04:10.346 CXX test/cpp_headers/json.o 00:04:10.346 CXX test/cpp_headers/jsonrpc.o 00:04:10.346 CXX test/cpp_headers/keyring.o 00:04:10.346 CXX test/cpp_headers/keyring_module.o 00:04:10.346 CC test/nvme/reserve/reserve.o 00:04:10.346 CXX test/cpp_headers/likely.o 00:04:10.346 CC test/nvme/simple_copy/simple_copy.o 00:04:10.346 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:04:10.346 CXX test/cpp_headers/log.o 00:04:10.346 CXX test/cpp_headers/lvol.o 00:04:10.346 CXX test/cpp_headers/md5.o 00:04:10.346 CXX test/cpp_headers/memory.o 00:04:10.605 CC test/nvme/connect_stress/connect_stress.o 00:04:10.605 CXX test/cpp_headers/mmio.o 00:04:10.605 LINK pmr_persistence 00:04:10.605 CXX test/cpp_headers/nbd.o 00:04:10.605 LINK reserve 00:04:10.605 CXX test/cpp_headers/net.o 00:04:10.605 CXX test/cpp_headers/notify.o 00:04:10.605 LINK simple_copy 00:04:10.605 CC test/nvme/boot_partition/boot_partition.o 00:04:10.605 CXX test/cpp_headers/nvme.o 00:04:10.605 LINK connect_stress 00:04:10.605 CXX test/cpp_headers/nvme_intel.o 00:04:10.605 CXX test/cpp_headers/nvme_ocssd.o 00:04:10.605 CC test/nvme/compliance/nvme_compliance.o 00:04:10.863 CC test/nvme/fused_ordering/fused_ordering.o 00:04:10.863 LINK boot_partition 00:04:10.863 CXX test/cpp_headers/nvme_ocssd_spec.o 00:04:10.863 CC test/nvme/doorbell_aers/doorbell_aers.o 00:04:10.863 CC test/nvme/fdp/fdp.o 00:04:10.863 CXX test/cpp_headers/nvme_spec.o 00:04:10.863 CXX test/cpp_headers/nvme_zns.o 00:04:10.863 CC examples/nvmf/nvmf/nvmf.o 00:04:10.863 LINK fused_ordering 00:04:10.863 CXX test/cpp_headers/nvmf_cmd.o 00:04:10.863 LINK nvme_compliance 00:04:10.863 CXX test/cpp_headers/nvmf_fc_spec.o 00:04:10.863 CC test/nvme/cuse/cuse.o 00:04:11.121 LINK doorbell_aers 00:04:11.121 CXX test/cpp_headers/nvmf.o 
00:04:11.121 CXX test/cpp_headers/nvmf_spec.o 00:04:11.121 CXX test/cpp_headers/nvmf_transport.o 00:04:11.121 CXX test/cpp_headers/opal.o 00:04:11.121 LINK fdp 00:04:11.121 CXX test/cpp_headers/opal_spec.o 00:04:11.121 CXX test/cpp_headers/pci_ids.o 00:04:11.121 CXX test/cpp_headers/pipe.o 00:04:11.121 LINK nvmf 00:04:11.121 CXX test/cpp_headers/queue.o 00:04:11.121 CXX test/cpp_headers/reduce.o 00:04:11.121 CXX test/cpp_headers/rpc.o 00:04:11.121 CXX test/cpp_headers/scheduler.o 00:04:11.121 CXX test/cpp_headers/scsi.o 00:04:11.121 CXX test/cpp_headers/scsi_spec.o 00:04:11.380 CXX test/cpp_headers/sock.o 00:04:11.380 CXX test/cpp_headers/stdinc.o 00:04:11.380 CXX test/cpp_headers/string.o 00:04:11.380 CXX test/cpp_headers/thread.o 00:04:11.380 CXX test/cpp_headers/trace.o 00:04:11.380 CXX test/cpp_headers/trace_parser.o 00:04:11.380 CXX test/cpp_headers/tree.o 00:04:11.380 CXX test/cpp_headers/ublk.o 00:04:11.380 CXX test/cpp_headers/util.o 00:04:11.380 CXX test/cpp_headers/uuid.o 00:04:11.380 CXX test/cpp_headers/version.o 00:04:11.380 CXX test/cpp_headers/vfio_user_pci.o 00:04:11.380 CXX test/cpp_headers/vfio_user_spec.o 00:04:11.380 CXX test/cpp_headers/vhost.o 00:04:11.380 CXX test/cpp_headers/vmd.o 00:04:11.380 CXX test/cpp_headers/xor.o 00:04:11.380 CXX test/cpp_headers/zipf.o 00:04:11.947 LINK cuse 00:04:13.849 LINK esnap 00:04:14.108 ************************************ 00:04:14.108 END TEST make 00:04:14.108 ************************************ 00:04:14.108 00:04:14.108 real 1m1.910s 00:04:14.108 user 4m58.060s 00:04:14.108 sys 0m51.025s 00:04:14.108 04:54:34 make -- common/autotest_common.sh@1130 -- $ xtrace_disable 00:04:14.108 04:54:34 make -- common/autotest_common.sh@10 -- $ set +x 00:04:14.108 04:54:34 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:04:14.108 04:54:34 -- pm/common@29 -- $ signal_monitor_resources TERM 00:04:14.108 04:54:34 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:04:14.108 04:54:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:14.108 04:54:34 -- pm/common@43 -- $ [[ -e /home/vagrant/spdk_repo/spdk/../output/power/collect-cpu-load.pid ]] 00:04:14.108 04:54:34 -- pm/common@44 -- $ pid=5801 00:04:14.108 04:54:34 -- pm/common@50 -- $ kill -TERM 5801 00:04:14.108 04:54:34 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:04:14.108 04:54:34 -- pm/common@43 -- $ [[ -e /home/vagrant/spdk_repo/spdk/../output/power/collect-vmstat.pid ]] 00:04:14.108 04:54:34 -- pm/common@44 -- $ pid=5803 00:04:14.108 04:54:34 -- pm/common@50 -- $ kill -TERM 5803 00:04:14.108 04:54:34 -- spdk/autorun.sh@26 -- $ (( SPDK_TEST_UNITTEST == 1 || SPDK_RUN_FUNCTIONAL_TEST == 1 )) 00:04:14.108 04:54:34 -- spdk/autorun.sh@27 -- $ sudo -E /home/vagrant/spdk_repo/spdk/autotest.sh /home/vagrant/spdk_repo/autorun-spdk.conf 00:04:14.108 04:54:34 -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:04:14.108 04:54:34 -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:04:14.108 04:54:34 -- common/autotest_common.sh@1711 -- # lcov --version 00:04:14.108 04:54:34 -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:04:14.108 04:54:34 -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:14.108 04:54:34 -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:14.108 04:54:34 -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:14.108 04:54:34 -- scripts/common.sh@336 -- # IFS=.-: 00:04:14.108 04:54:34 -- scripts/common.sh@336 -- # read -ra ver1 00:04:14.108 04:54:34 -- scripts/common.sh@337 -- # IFS=.-: 00:04:14.108 04:54:34 -- 
scripts/common.sh@337 -- # read -ra ver2 00:04:14.108 04:54:34 -- scripts/common.sh@338 -- # local 'op=<' 00:04:14.108 04:54:34 -- scripts/common.sh@340 -- # ver1_l=2 00:04:14.108 04:54:34 -- scripts/common.sh@341 -- # ver2_l=1 00:04:14.108 04:54:34 -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:14.108 04:54:34 -- scripts/common.sh@344 -- # case "$op" in 00:04:14.108 04:54:34 -- scripts/common.sh@345 -- # : 1 00:04:14.108 04:54:34 -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:14.108 04:54:34 -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:14.108 04:54:34 -- scripts/common.sh@365 -- # decimal 1 00:04:14.108 04:54:34 -- scripts/common.sh@353 -- # local d=1 00:04:14.108 04:54:34 -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:14.108 04:54:34 -- scripts/common.sh@355 -- # echo 1 00:04:14.108 04:54:34 -- scripts/common.sh@365 -- # ver1[v]=1 00:04:14.108 04:54:34 -- scripts/common.sh@366 -- # decimal 2 00:04:14.108 04:54:34 -- scripts/common.sh@353 -- # local d=2 00:04:14.108 04:54:34 -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:14.108 04:54:34 -- scripts/common.sh@355 -- # echo 2 00:04:14.108 04:54:34 -- scripts/common.sh@366 -- # ver2[v]=2 00:04:14.108 04:54:34 -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:14.108 04:54:34 -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:14.108 04:54:34 -- scripts/common.sh@368 -- # return 0 00:04:14.108 04:54:34 -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:14.108 04:54:34 -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:04:14.108 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:14.108 --rc genhtml_branch_coverage=1 00:04:14.108 --rc genhtml_function_coverage=1 00:04:14.108 --rc genhtml_legend=1 00:04:14.108 --rc geninfo_all_blocks=1 00:04:14.108 --rc geninfo_unexecuted_blocks=1 00:04:14.108 00:04:14.108 ' 00:04:14.108 04:54:34 -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:04:14.108 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:14.108 --rc genhtml_branch_coverage=1 00:04:14.108 --rc genhtml_function_coverage=1 00:04:14.108 --rc genhtml_legend=1 00:04:14.108 --rc geninfo_all_blocks=1 00:04:14.108 --rc geninfo_unexecuted_blocks=1 00:04:14.108 00:04:14.108 ' 00:04:14.108 04:54:34 -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:04:14.108 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:14.108 --rc genhtml_branch_coverage=1 00:04:14.108 --rc genhtml_function_coverage=1 00:04:14.108 --rc genhtml_legend=1 00:04:14.108 --rc geninfo_all_blocks=1 00:04:14.108 --rc geninfo_unexecuted_blocks=1 00:04:14.108 00:04:14.108 ' 00:04:14.108 04:54:34 -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:04:14.108 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:14.108 --rc genhtml_branch_coverage=1 00:04:14.108 --rc genhtml_function_coverage=1 00:04:14.108 --rc genhtml_legend=1 00:04:14.108 --rc geninfo_all_blocks=1 00:04:14.108 --rc geninfo_unexecuted_blocks=1 00:04:14.108 00:04:14.108 ' 00:04:14.108 04:54:34 -- spdk/autotest.sh@25 -- # source /home/vagrant/spdk_repo/spdk/test/nvmf/common.sh 00:04:14.108 04:54:34 -- nvmf/common.sh@7 -- # uname -s 00:04:14.108 04:54:34 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:04:14.108 04:54:34 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:04:14.108 04:54:34 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:04:14.108 04:54:34 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:04:14.108 
04:54:34 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:04:14.108 04:54:34 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:04:14.108 04:54:34 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:04:14.108 04:54:34 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:04:14.108 04:54:34 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:04:14.108 04:54:34 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:04:14.108 04:54:34 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:9174b0f4-4dff-4414-95f3-547baa722471 00:04:14.108 04:54:34 -- nvmf/common.sh@18 -- # NVME_HOSTID=9174b0f4-4dff-4414-95f3-547baa722471 00:04:14.108 04:54:34 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:04:14.108 04:54:34 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:04:14.108 04:54:34 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:04:14.108 04:54:34 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:04:14.108 04:54:34 -- nvmf/common.sh@49 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:04:14.108 04:54:34 -- scripts/common.sh@15 -- # shopt -s extglob 00:04:14.108 04:54:34 -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:04:14.108 04:54:34 -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:04:14.108 04:54:34 -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:04:14.108 04:54:34 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:14.108 04:54:34 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:14.108 04:54:34 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:14.108 04:54:34 -- paths/export.sh@5 -- # export PATH 00:04:14.108 04:54:34 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:04:14.108 04:54:34 -- nvmf/common.sh@51 -- # : 0 00:04:14.108 04:54:34 -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:04:14.108 04:54:34 -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:04:14.108 04:54:34 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:04:14.108 04:54:34 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:04:14.108 04:54:34 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:04:14.108 04:54:34 -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:04:14.108 /home/vagrant/spdk_repo/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:04:14.108 04:54:34 -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:04:14.108 04:54:34 -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:04:14.108 04:54:34 -- nvmf/common.sh@55 -- # have_pci_nics=0 00:04:14.108 
04:54:34 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:04:14.108 04:54:34 -- spdk/autotest.sh@32 -- # uname -s 00:04:14.108 04:54:34 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:04:14.108 04:54:34 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:04:14.108 04:54:34 -- spdk/autotest.sh@34 -- # mkdir -p /home/vagrant/spdk_repo/spdk/../output/coredumps 00:04:14.108 04:54:34 -- spdk/autotest.sh@39 -- # echo '|/home/vagrant/spdk_repo/spdk/scripts/core-collector.sh %P %s %t' 00:04:14.108 04:54:34 -- spdk/autotest.sh@40 -- # echo /home/vagrant/spdk_repo/spdk/../output/coredumps 00:04:14.108 04:54:34 -- spdk/autotest.sh@44 -- # modprobe nbd 00:04:14.367 04:54:34 -- spdk/autotest.sh@46 -- # type -P udevadm 00:04:14.367 04:54:34 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:04:14.367 04:54:34 -- spdk/autotest.sh@48 -- # udevadm_pid=68335 00:04:14.367 04:54:34 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:04:14.367 04:54:34 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:04:14.367 04:54:34 -- pm/common@17 -- # local monitor 00:04:14.367 04:54:34 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:14.367 04:54:34 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:04:14.367 04:54:34 -- pm/common@25 -- # sleep 1 00:04:14.367 04:54:34 -- pm/common@21 -- # date +%s 00:04:14.367 04:54:34 -- pm/common@21 -- # date +%s 00:04:14.367 04:54:34 -- pm/common@21 -- # /home/vagrant/spdk_repo/spdk/scripts/perf/pm/collect-cpu-load -d /home/vagrant/spdk_repo/spdk/../output/power -l -p monitor.autotest.sh.1734238474 00:04:14.367 04:54:34 -- pm/common@21 -- # /home/vagrant/spdk_repo/spdk/scripts/perf/pm/collect-vmstat -d /home/vagrant/spdk_repo/spdk/../output/power -l -p monitor.autotest.sh.1734238474 00:04:14.367 Redirecting to /home/vagrant/spdk_repo/spdk/../output/power/monitor.autotest.sh.1734238474_collect-vmstat.pm.log 00:04:14.367 Redirecting to /home/vagrant/spdk_repo/spdk/../output/power/monitor.autotest.sh.1734238474_collect-cpu-load.pm.log 00:04:15.302 04:54:35 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:04:15.302 04:54:35 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:04:15.302 04:54:35 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:15.302 04:54:35 -- common/autotest_common.sh@10 -- # set +x 00:04:15.302 04:54:35 -- spdk/autotest.sh@59 -- # create_test_list 00:04:15.302 04:54:35 -- common/autotest_common.sh@752 -- # xtrace_disable 00:04:15.302 04:54:35 -- common/autotest_common.sh@10 -- # set +x 00:04:15.302 04:54:35 -- spdk/autotest.sh@61 -- # dirname /home/vagrant/spdk_repo/spdk/autotest.sh 00:04:15.302 04:54:35 -- spdk/autotest.sh@61 -- # readlink -f /home/vagrant/spdk_repo/spdk 00:04:15.302 04:54:35 -- spdk/autotest.sh@61 -- # src=/home/vagrant/spdk_repo/spdk 00:04:15.302 04:54:35 -- spdk/autotest.sh@62 -- # out=/home/vagrant/spdk_repo/spdk/../output 00:04:15.302 04:54:35 -- spdk/autotest.sh@63 -- # cd /home/vagrant/spdk_repo/spdk 00:04:15.302 04:54:35 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:04:15.302 04:54:35 -- common/autotest_common.sh@1457 -- # uname 00:04:15.302 04:54:35 -- common/autotest_common.sh@1457 -- # '[' Linux = FreeBSD ']' 00:04:15.302 04:54:35 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:04:15.302 04:54:35 -- common/autotest_common.sh@1477 -- # uname 00:04:15.302 04:54:35 -- common/autotest_common.sh@1477 -- # [[ Linux = FreeBSD ]] 00:04:15.302 04:54:35 -- 
spdk/autotest.sh@68 -- # [[ y == y ]] 00:04:15.302 04:54:35 -- spdk/autotest.sh@70 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 --version 00:04:15.302 lcov: LCOV version 1.15 00:04:15.302 04:54:35 -- spdk/autotest.sh@72 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 -q -c --no-external -i -t Baseline -d /home/vagrant/spdk_repo/spdk -o /home/vagrant/spdk_repo/spdk/../output/cov_base.info 00:04:30.169 /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:04:30.169 geninfo: WARNING: GCOV did not produce any data for /home/vagrant/spdk_repo/spdk/lib/nvme/nvme_stubs.gcno 00:04:45.061 04:55:03 -- spdk/autotest.sh@76 -- # timing_enter pre_cleanup 00:04:45.061 04:55:03 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:45.061 04:55:03 -- common/autotest_common.sh@10 -- # set +x 00:04:45.061 04:55:03 -- spdk/autotest.sh@78 -- # rm -f 00:04:45.061 04:55:03 -- spdk/autotest.sh@81 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:04:45.061 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:04:45.061 0000:00:11.0 (1b36 0010): Already using the nvme driver 00:04:45.061 0000:00:10.0 (1b36 0010): Already using the nvme driver 00:04:45.061 0000:00:12.0 (1b36 0010): Already using the nvme driver 00:04:45.061 0000:00:13.0 (1b36 0010): Already using the nvme driver 00:04:45.061 04:55:04 -- spdk/autotest.sh@83 -- # get_zoned_devs 00:04:45.061 04:55:04 -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:04:45.061 04:55:04 -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:04:45.061 04:55:04 -- common/autotest_common.sh@1658 -- # zoned_ctrls=() 00:04:45.061 04:55:04 -- common/autotest_common.sh@1658 -- # local -A zoned_ctrls 00:04:45.061 04:55:04 -- common/autotest_common.sh@1659 -- # local nvme bdf ns 00:04:45.061 04:55:04 -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:04:45.061 04:55:04 -- common/autotest_common.sh@1669 -- # bdf=0000:00:11.0 00:04:45.061 04:55:04 -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:04:45.061 04:55:04 -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n1 00:04:45.061 04:55:04 -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:04:45.061 04:55:04 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:04:45.061 04:55:04 -- common/autotest_common.sh@1669 -- # bdf=0000:00:12.0 00:04:45.061 04:55:04 -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:04:45.061 04:55:04 -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1n1 00:04:45.061 04:55:04 -- common/autotest_common.sh@1650 -- # local device=nvme1n1 00:04:45.061 04:55:04 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:04:45.061 04:55:04 -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1n2 
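The cmp_versions trace further up (scripts/common.sh@333-@368, entered from lt 1.15 2 while choosing the lcov flags) is a field-by-field numeric comparison of dotted version strings. A condensed sketch of the same idea, simplified to split only on dots where the traced helper also splits on '-' and ':':

  # Compare two dotted versions numerically, left to right, treating
  # missing fields as 0, as the traced cmp_versions does.
  version_lt() {
    local IFS=.
    local -a a=($1)
    local -a b=($2)
    local i n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} ))
    for (( i = 0; i < n; i++ )); do
      (( ${a[i]:-0} < ${b[i]:-0} )) && return 0
      (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
    done
    return 1   # equal versions are not less-than
  }
  version_lt 1.15 2 && echo "lcov predates 2"   # matches the traced outcome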
00:04:45.061 04:55:04 -- common/autotest_common.sh@1650 -- # local device=nvme1n2 00:04:45.061 04:55:04 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme1n2/queue/zoned ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:04:45.061 04:55:04 -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1n3 00:04:45.061 04:55:04 -- common/autotest_common.sh@1650 -- # local device=nvme1n3 00:04:45.061 04:55:04 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme1n3/queue/zoned ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:04:45.061 04:55:04 -- common/autotest_common.sh@1669 -- # bdf=0000:00:13.0 00:04:45.061 04:55:04 -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:04:45.061 04:55:04 -- common/autotest_common.sh@1671 -- # is_block_zoned nvme2c2n1 00:04:45.061 04:55:04 -- common/autotest_common.sh@1650 -- # local device=nvme2c2n1 00:04:45.061 04:55:04 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme2c2n1/queue/zoned ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:04:45.061 04:55:04 -- common/autotest_common.sh@1669 -- # bdf=0000:00:10.0 00:04:45.061 04:55:04 -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:04:45.061 04:55:04 -- common/autotest_common.sh@1671 -- # is_block_zoned nvme3n1 00:04:45.061 04:55:04 -- common/autotest_common.sh@1650 -- # local device=nvme3n1 00:04:45.061 04:55:04 -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme3n1/queue/zoned ]] 00:04:45.061 04:55:04 -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:04:45.061 04:55:04 -- spdk/autotest.sh@85 -- # (( 0 > 0 )) 00:04:45.061 04:55:04 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:45.061 04:55:04 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:45.061 04:55:04 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme0n1 00:04:45.061 04:55:04 -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:04:45.061 04:55:04 -- scripts/common.sh@390 -- # /home/vagrant/spdk_repo/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:04:45.061 No valid GPT data, bailing 00:04:45.061 04:55:04 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:04:45.061 04:55:04 -- scripts/common.sh@394 -- # pt= 00:04:45.061 04:55:04 -- scripts/common.sh@395 -- # return 1 00:04:45.061 04:55:04 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:04:45.061 1+0 records in 00:04:45.061 1+0 records out 00:04:45.061 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00445331 s, 235 MB/s 00:04:45.061 04:55:04 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:45.061 04:55:04 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:45.061 04:55:04 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme1n1 00:04:45.061 04:55:04 -- scripts/common.sh@381 -- # local block=/dev/nvme1n1 pt 00:04:45.061 04:55:04 -- scripts/common.sh@390 -- # /home/vagrant/spdk_repo/spdk/scripts/spdk-gpt.py /dev/nvme1n1 00:04:45.061 No valid GPT data, bailing 00:04:45.061 04:55:04 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme1n1 00:04:45.061 04:55:04 -- scripts/common.sh@394 -- # pt= 00:04:45.061 04:55:04 -- scripts/common.sh@395 -- # return 1 00:04:45.061 04:55:04 -- 
spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme1n1 bs=1M count=1 00:04:45.061 1+0 records in 00:04:45.061 1+0 records out 00:04:45.061 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00564553 s, 186 MB/s 00:04:45.061 04:55:04 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:45.061 04:55:04 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:45.061 04:55:04 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme1n2 00:04:45.061 04:55:04 -- scripts/common.sh@381 -- # local block=/dev/nvme1n2 pt 00:04:45.061 04:55:04 -- scripts/common.sh@390 -- # /home/vagrant/spdk_repo/spdk/scripts/spdk-gpt.py /dev/nvme1n2 00:04:45.061 No valid GPT data, bailing 00:04:45.061 04:55:04 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme1n2 00:04:45.061 04:55:04 -- scripts/common.sh@394 -- # pt= 00:04:45.061 04:55:04 -- scripts/common.sh@395 -- # return 1 00:04:45.061 04:55:04 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme1n2 bs=1M count=1 00:04:45.061 1+0 records in 00:04:45.061 1+0 records out 00:04:45.061 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00543634 s, 193 MB/s 00:04:45.061 04:55:04 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:45.061 04:55:04 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:45.061 04:55:04 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme1n3 00:04:45.061 04:55:04 -- scripts/common.sh@381 -- # local block=/dev/nvme1n3 pt 00:04:45.061 04:55:04 -- scripts/common.sh@390 -- # /home/vagrant/spdk_repo/spdk/scripts/spdk-gpt.py /dev/nvme1n3 00:04:45.061 No valid GPT data, bailing 00:04:45.061 04:55:04 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme1n3 00:04:45.062 04:55:04 -- scripts/common.sh@394 -- # pt= 00:04:45.062 04:55:04 -- scripts/common.sh@395 -- # return 1 00:04:45.062 04:55:04 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme1n3 bs=1M count=1 00:04:45.062 1+0 records in 00:04:45.062 1+0 records out 00:04:45.062 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00911822 s, 115 MB/s 00:04:45.062 04:55:04 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:45.062 04:55:04 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:45.062 04:55:04 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme2n1 00:04:45.062 04:55:04 -- scripts/common.sh@381 -- # local block=/dev/nvme2n1 pt 00:04:45.062 04:55:04 -- scripts/common.sh@390 -- # /home/vagrant/spdk_repo/spdk/scripts/spdk-gpt.py /dev/nvme2n1 00:04:45.062 No valid GPT data, bailing 00:04:45.062 04:55:04 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme2n1 00:04:45.062 04:55:04 -- scripts/common.sh@394 -- # pt= 00:04:45.062 04:55:04 -- scripts/common.sh@395 -- # return 1 00:04:45.062 04:55:04 -- spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme2n1 bs=1M count=1 00:04:45.062 1+0 records in 00:04:45.062 1+0 records out 00:04:45.062 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.005688 s, 184 MB/s 00:04:45.062 04:55:04 -- spdk/autotest.sh@97 -- # for dev in /dev/nvme*n!(*p*) 00:04:45.062 04:55:04 -- spdk/autotest.sh@99 -- # [[ -z '' ]] 00:04:45.062 04:55:04 -- spdk/autotest.sh@100 -- # block_in_use /dev/nvme3n1 00:04:45.062 04:55:04 -- scripts/common.sh@381 -- # local block=/dev/nvme3n1 pt 00:04:45.062 04:55:04 -- scripts/common.sh@390 -- # /home/vagrant/spdk_repo/spdk/scripts/spdk-gpt.py /dev/nvme3n1 00:04:45.062 No valid GPT data, bailing 00:04:45.062 04:55:04 -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme3n1 00:04:45.062 04:55:04 -- scripts/common.sh@394 -- # pt= 00:04:45.062 04:55:04 -- scripts/common.sh@395 -- # return 1 00:04:45.062 04:55:04 -- 
spdk/autotest.sh@101 -- # dd if=/dev/zero of=/dev/nvme3n1 bs=1M count=1 00:04:45.062 1+0 records in 00:04:45.062 1+0 records out 00:04:45.062 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0261021 s, 40.2 MB/s 00:04:45.062 04:55:04 -- spdk/autotest.sh@105 -- # sync 00:04:45.062 04:55:04 -- spdk/autotest.sh@107 -- # xtrace_disable_per_cmd reap_spdk_processes 00:04:45.062 04:55:04 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:04:45.062 04:55:04 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:46.469 04:55:06 -- spdk/autotest.sh@111 -- # uname -s 00:04:46.469 04:55:06 -- spdk/autotest.sh@111 -- # [[ Linux == Linux ]] 00:04:46.469 04:55:06 -- spdk/autotest.sh@111 -- # [[ 0 -eq 1 ]] 00:04:46.469 04:55:06 -- spdk/autotest.sh@115 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh status 00:04:47.035 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:04:47.294 Hugepages 00:04:47.294 node hugesize free / total 00:04:47.552 node0 1048576kB 0 / 0 00:04:47.552 node0 2048kB 0 / 0 00:04:47.552 00:04:47.552 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:47.552 virtio 0000:00:03.0 1af4 1001 unknown virtio-pci - vda 00:04:47.552 NVMe 0000:00:10.0 1b36 0010 unknown nvme nvme3 nvme3n1 00:04:47.552 NVMe 0000:00:11.0 1b36 0010 unknown nvme nvme0 nvme0n1 00:04:47.810 NVMe 0000:00:12.0 1b36 0010 unknown nvme nvme1 nvme1n1 nvme1n2 nvme1n3 00:04:47.810 NVMe 0000:00:13.0 1b36 0010 unknown nvme nvme2 nvme2n1 00:04:47.810 04:55:07 -- spdk/autotest.sh@117 -- # uname -s 00:04:47.810 04:55:07 -- spdk/autotest.sh@117 -- # [[ Linux == Linux ]] 00:04:47.810 04:55:07 -- spdk/autotest.sh@119 -- # nvme_namespace_revert 00:04:47.810 04:55:07 -- common/autotest_common.sh@1516 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:04:48.377 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:04:48.636 0000:00:11.0 (1b36 0010): nvme -> uio_pci_generic 00:04:48.636 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:04:48.893 0000:00:13.0 (1b36 0010): nvme -> uio_pci_generic 00:04:48.894 0000:00:12.0 (1b36 0010): nvme -> uio_pci_generic 00:04:48.894 04:55:08 -- common/autotest_common.sh@1517 -- # sleep 1 00:04:49.828 04:55:09 -- common/autotest_common.sh@1518 -- # bdfs=() 00:04:49.828 04:55:09 -- common/autotest_common.sh@1518 -- # local bdfs 00:04:49.828 04:55:09 -- common/autotest_common.sh@1520 -- # bdfs=($(get_nvme_bdfs)) 00:04:49.828 04:55:09 -- common/autotest_common.sh@1520 -- # get_nvme_bdfs 00:04:49.828 04:55:09 -- common/autotest_common.sh@1498 -- # bdfs=() 00:04:49.828 04:55:09 -- common/autotest_common.sh@1498 -- # local bdfs 00:04:49.828 04:55:09 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:49.828 04:55:09 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:04:49.828 04:55:09 -- common/autotest_common.sh@1499 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:04:49.828 04:55:09 -- common/autotest_common.sh@1500 -- # (( 4 == 0 )) 00:04:49.828 04:55:09 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:00:10.0 0000:00:11.0 0000:00:12.0 0000:00:13.0 00:04:49.828 04:55:09 -- common/autotest_common.sh@1522 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:04:50.085 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:04:50.342 Waiting for block devices as requested 
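The wipe pass above is mechanical: each NVMe namespace (the extglob /dev/nvme*n!(*p*) skips partition nodes) is probed for a partition-table signature via spdk-gpt.py and blkid, and a device reporting none ("No valid GPT data, bailing") has its first MiB zeroed so stale metadata cannot leak into the tests. A reduced sketch of that loop using only the blkid probe (run as root; destructive on matching devices):

  # Zero the first MiB of every NVMe namespace with no partition-table
  # signature; blkid -s PTTYPE prints nothing for such devices.
  shopt -s extglob
  for dev in /dev/nvme*n!(*p*); do
    if [[ -z "$(blkid -s PTTYPE -o value "$dev")" ]]; then
      dd if=/dev/zero of="$dev" bs=1M count=1
    fi
  done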
00:04:50.342 0000:00:11.0 (1b36 0010): uio_pci_generic -> nvme 00:04:50.342 0000:00:10.0 (1b36 0010): uio_pci_generic -> nvme 00:04:50.600 0000:00:12.0 (1b36 0010): uio_pci_generic -> nvme 00:04:50.600 0000:00:13.0 (1b36 0010): uio_pci_generic -> nvme 00:04:55.871 * Events for some block/disk devices (0000:00:13.0) were not caught, they may be missing 00:04:55.871 04:55:15 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:04:55.871 04:55:15 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:00:10.0 00:04:55.871 04:55:15 -- common/autotest_common.sh@1487 -- # grep 0000:00:10.0/nvme/nvme 00:04:55.871 04:55:15 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 /sys/class/nvme/nvme1 /sys/class/nvme/nvme2 /sys/class/nvme/nvme3 00:04:55.871 04:55:15 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:00/0000:00:10.0/nvme/nvme1 00:04:55.871 04:55:15 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:00/0000:00:10.0/nvme/nvme1 ]] 00:04:55.871 04:55:15 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:00/0000:00:10.0/nvme/nvme1 00:04:55.871 04:55:15 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme1 00:04:55.871 04:55:15 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme1 00:04:55.871 04:55:15 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme1 ]] 00:04:55.871 04:55:15 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme1 00:04:55.871 04:55:15 -- common/autotest_common.sh@1531 -- # grep oacs 00:04:55.871 04:55:15 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:04:55.871 04:55:15 -- common/autotest_common.sh@1531 -- # oacs=' 0x12a' 00:04:55.871 04:55:15 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:04:55.871 04:55:15 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:04:55.871 04:55:15 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme1 00:04:55.871 04:55:15 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:04:55.871 04:55:15 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:04:55.871 04:55:15 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:04:55.871 04:55:15 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:04:55.871 04:55:15 -- common/autotest_common.sh@1543 -- # continue 00:04:55.871 04:55:15 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:04:55.871 04:55:15 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:00:11.0 00:04:55.871 04:55:15 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 /sys/class/nvme/nvme1 /sys/class/nvme/nvme2 /sys/class/nvme/nvme3 00:04:55.871 04:55:15 -- common/autotest_common.sh@1487 -- # grep 0000:00:11.0/nvme/nvme 00:04:55.871 04:55:15 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:00/0000:00:11.0/nvme/nvme0 00:04:55.871 04:55:15 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:00/0000:00:11.0/nvme/nvme0 ]] 00:04:55.871 04:55:15 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:00/0000:00:11.0/nvme/nvme0 00:04:55.871 04:55:15 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme0 00:04:55.871 04:55:15 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme0 00:04:55.871 04:55:15 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme0 ]] 00:04:55.871 04:55:15 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme0 00:04:55.871 04:55:15 -- common/autotest_common.sh@1531 -- # grep oacs 00:04:55.871 04:55:15 -- common/autotest_common.sh@1531 -- # 
cut -d: -f2 00:04:55.871 04:55:15 -- common/autotest_common.sh@1531 -- # oacs=' 0x12a' 00:04:55.871 04:55:15 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:04:55.871 04:55:15 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:04:55.871 04:55:15 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:04:55.871 04:55:15 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:04:55.871 04:55:15 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme0 00:04:55.871 04:55:15 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:04:55.871 04:55:15 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:04:55.871 04:55:15 -- common/autotest_common.sh@1543 -- # continue 00:04:55.871 04:55:15 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:04:55.871 04:55:15 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:00:12.0 00:04:55.871 04:55:15 -- common/autotest_common.sh@1487 -- # grep 0000:00:12.0/nvme/nvme 00:04:55.872 04:55:15 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 /sys/class/nvme/nvme1 /sys/class/nvme/nvme2 /sys/class/nvme/nvme3 00:04:55.872 04:55:15 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:00/0000:00:12.0/nvme/nvme2 00:04:55.872 04:55:15 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:00/0000:00:12.0/nvme/nvme2 ]] 00:04:55.872 04:55:15 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:00/0000:00:12.0/nvme/nvme2 00:04:55.872 04:55:15 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme2 00:04:55.872 04:55:15 -- common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme2 00:04:55.872 04:55:15 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme2 ]] 00:04:55.872 04:55:15 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme2 00:04:55.872 04:55:15 -- common/autotest_common.sh@1531 -- # grep oacs 00:04:55.872 04:55:15 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:04:55.872 04:55:15 -- common/autotest_common.sh@1531 -- # oacs=' 0x12a' 00:04:55.872 04:55:15 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:04:55.872 04:55:15 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:04:55.872 04:55:15 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme2 00:04:55.872 04:55:15 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:04:55.872 04:55:15 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:04:55.872 04:55:15 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:04:55.872 04:55:15 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:04:55.872 04:55:15 -- common/autotest_common.sh@1543 -- # continue 00:04:55.872 04:55:15 -- common/autotest_common.sh@1524 -- # for bdf in "${bdfs[@]}" 00:04:55.872 04:55:15 -- common/autotest_common.sh@1525 -- # get_nvme_ctrlr_from_bdf 0000:00:13.0 00:04:55.872 04:55:15 -- common/autotest_common.sh@1487 -- # readlink -f /sys/class/nvme/nvme0 /sys/class/nvme/nvme1 /sys/class/nvme/nvme2 /sys/class/nvme/nvme3 00:04:55.872 04:55:15 -- common/autotest_common.sh@1487 -- # grep 0000:00:13.0/nvme/nvme 00:04:55.872 04:55:15 -- common/autotest_common.sh@1487 -- # bdf_sysfs_path=/sys/devices/pci0000:00/0000:00:13.0/nvme/nvme3 00:04:55.872 04:55:15 -- common/autotest_common.sh@1488 -- # [[ -z /sys/devices/pci0000:00/0000:00:13.0/nvme/nvme3 ]] 00:04:55.872 04:55:15 -- common/autotest_common.sh@1492 -- # basename /sys/devices/pci0000:00/0000:00:13.0/nvme/nvme3 00:04:55.872 04:55:15 -- common/autotest_common.sh@1492 -- # printf '%s\n' nvme3 00:04:55.872 04:55:15 -- 
common/autotest_common.sh@1525 -- # nvme_ctrlr=/dev/nvme3 00:04:55.872 04:55:15 -- common/autotest_common.sh@1526 -- # [[ -z /dev/nvme3 ]] 00:04:55.872 04:55:15 -- common/autotest_common.sh@1531 -- # cut -d: -f2 00:04:55.872 04:55:15 -- common/autotest_common.sh@1531 -- # nvme id-ctrl /dev/nvme3 00:04:55.872 04:55:15 -- common/autotest_common.sh@1531 -- # grep oacs 00:04:55.872 04:55:15 -- common/autotest_common.sh@1531 -- # oacs=' 0x12a' 00:04:55.872 04:55:15 -- common/autotest_common.sh@1532 -- # oacs_ns_manage=8 00:04:55.872 04:55:15 -- common/autotest_common.sh@1534 -- # [[ 8 -ne 0 ]] 00:04:55.872 04:55:15 -- common/autotest_common.sh@1540 -- # nvme id-ctrl /dev/nvme3 00:04:55.872 04:55:15 -- common/autotest_common.sh@1540 -- # cut -d: -f2 00:04:55.872 04:55:15 -- common/autotest_common.sh@1540 -- # grep unvmcap 00:04:55.872 04:55:15 -- common/autotest_common.sh@1540 -- # unvmcap=' 0' 00:04:55.872 04:55:15 -- common/autotest_common.sh@1541 -- # [[ 0 -eq 0 ]] 00:04:55.872 04:55:15 -- common/autotest_common.sh@1543 -- # continue 00:04:55.872 04:55:15 -- spdk/autotest.sh@122 -- # timing_exit pre_cleanup 00:04:55.872 04:55:15 -- common/autotest_common.sh@732 -- # xtrace_disable 00:04:55.872 04:55:15 -- common/autotest_common.sh@10 -- # set +x 00:04:55.872 04:55:15 -- spdk/autotest.sh@125 -- # timing_enter afterboot 00:04:55.872 04:55:15 -- common/autotest_common.sh@726 -- # xtrace_disable 00:04:55.872 04:55:15 -- common/autotest_common.sh@10 -- # set +x 00:04:55.872 04:55:15 -- spdk/autotest.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:04:56.442 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:04:56.702 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:04:56.964 0000:00:13.0 (1b36 0010): nvme -> uio_pci_generic 00:04:56.964 0000:00:11.0 (1b36 0010): nvme -> uio_pci_generic 00:04:56.964 0000:00:12.0 (1b36 0010): nvme -> uio_pci_generic 00:04:56.964 04:55:16 -- spdk/autotest.sh@127 -- # timing_exit afterboot 00:04:56.964 04:55:16 -- common/autotest_common.sh@732 -- # xtrace_disable 00:04:56.964 04:55:16 -- common/autotest_common.sh@10 -- # set +x 00:04:56.964 04:55:17 -- spdk/autotest.sh@131 -- # opal_revert_cleanup 00:04:56.964 04:55:17 -- common/autotest_common.sh@1578 -- # mapfile -t bdfs 00:04:56.964 04:55:17 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs_by_id 0x0a54 00:04:56.964 04:55:17 -- common/autotest_common.sh@1563 -- # bdfs=() 00:04:56.964 04:55:17 -- common/autotest_common.sh@1563 -- # _bdfs=() 00:04:56.964 04:55:17 -- common/autotest_common.sh@1563 -- # local bdfs _bdfs 00:04:56.964 04:55:17 -- common/autotest_common.sh@1564 -- # _bdfs=($(get_nvme_bdfs)) 00:04:56.964 04:55:17 -- common/autotest_common.sh@1564 -- # get_nvme_bdfs 00:04:56.964 04:55:17 -- common/autotest_common.sh@1498 -- # bdfs=() 00:04:56.964 04:55:17 -- common/autotest_common.sh@1498 -- # local bdfs 00:04:56.964 04:55:17 -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:04:56.965 04:55:17 -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:04:56.965 04:55:17 -- common/autotest_common.sh@1499 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:04:57.226 04:55:17 -- common/autotest_common.sh@1500 -- # (( 4 == 0 )) 00:04:57.226 04:55:17 -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:00:10.0 0000:00:11.0 0000:00:12.0 0000:00:13.0 00:04:57.226 04:55:17 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}" 
00:04:57.226 04:55:17 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:00:10.0/device
00:04:57.226 04:55:17 -- common/autotest_common.sh@1566 -- # device=0x0010
00:04:57.226 04:55:17 -- common/autotest_common.sh@1567 -- # [[ 0x0010 == \0\x\0\a\5\4 ]]
00:04:57.226 04:55:17 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}"
00:04:57.226 04:55:17 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:00:11.0/device
00:04:57.226 04:55:17 -- common/autotest_common.sh@1566 -- # device=0x0010
00:04:57.226 04:55:17 -- common/autotest_common.sh@1567 -- # [[ 0x0010 == \0\x\0\a\5\4 ]]
00:04:57.226 04:55:17 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}"
00:04:57.226 04:55:17 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:00:12.0/device
00:04:57.226 04:55:17 -- common/autotest_common.sh@1566 -- # device=0x0010
00:04:57.226 04:55:17 -- common/autotest_common.sh@1567 -- # [[ 0x0010 == \0\x\0\a\5\4 ]]
00:04:57.226 04:55:17 -- common/autotest_common.sh@1565 -- # for bdf in "${_bdfs[@]}"
00:04:57.226 04:55:17 -- common/autotest_common.sh@1566 -- # cat /sys/bus/pci/devices/0000:00:13.0/device
00:04:57.226 04:55:17 -- common/autotest_common.sh@1566 -- # device=0x0010
00:04:57.226 04:55:17 -- common/autotest_common.sh@1567 -- # [[ 0x0010 == \0\x\0\a\5\4 ]]
00:04:57.226 04:55:17 -- common/autotest_common.sh@1572 -- # (( 0 > 0 ))
00:04:57.226 04:55:17 -- common/autotest_common.sh@1572 -- # return 0
00:04:57.226 04:55:17 -- common/autotest_common.sh@1579 -- # [[ -z '' ]]
00:04:57.226 04:55:17 -- common/autotest_common.sh@1580 -- # return 0
00:04:57.226 04:55:17 -- spdk/autotest.sh@137 -- # '[' 0 -eq 1 ']'
00:04:57.226 04:55:17 -- spdk/autotest.sh@141 -- # '[' 1 -eq 1 ']'
00:04:57.226 04:55:17 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]]
00:04:57.226 04:55:17 -- spdk/autotest.sh@142 -- # [[ 0 -eq 1 ]]
00:04:57.226 04:55:17 -- spdk/autotest.sh@149 -- # timing_enter lib
00:04:57.226 04:55:17 -- common/autotest_common.sh@726 -- # xtrace_disable
00:04:57.226 04:55:17 -- common/autotest_common.sh@10 -- # set +x
00:04:57.226 04:55:17 -- spdk/autotest.sh@151 -- # [[ 0 -eq 1 ]]
00:04:57.226 04:55:17 -- spdk/autotest.sh@155 -- # run_test env /home/vagrant/spdk_repo/spdk/test/env/env.sh
00:04:57.226 04:55:17 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:04:57.226 04:55:17 env -- common/autotest_common.sh@1111 -- # xtrace_disable
00:04:57.226 04:55:17 env -- common/autotest_common.sh@10 -- # set +x
00:04:57.226 ************************************
00:04:57.226 START TEST env
00:04:57.226 ************************************
00:04:57.226 04:55:17 env -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/env/env.sh
00:04:57.226 * Looking for test storage...
00:04:57.226 * Found test storage at /home/vagrant/spdk_repo/spdk/test/env 00:04:57.226 04:55:17 env -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:04:57.226 04:55:17 env -- common/autotest_common.sh@1711 -- # lcov --version 00:04:57.226 04:55:17 env -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:04:57.226 04:55:17 env -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:04:57.226 04:55:17 env -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:04:57.226 04:55:17 env -- scripts/common.sh@333 -- # local ver1 ver1_l 00:04:57.226 04:55:17 env -- scripts/common.sh@334 -- # local ver2 ver2_l 00:04:57.226 04:55:17 env -- scripts/common.sh@336 -- # IFS=.-: 00:04:57.226 04:55:17 env -- scripts/common.sh@336 -- # read -ra ver1 00:04:57.226 04:55:17 env -- scripts/common.sh@337 -- # IFS=.-: 00:04:57.226 04:55:17 env -- scripts/common.sh@337 -- # read -ra ver2 00:04:57.226 04:55:17 env -- scripts/common.sh@338 -- # local 'op=<' 00:04:57.226 04:55:17 env -- scripts/common.sh@340 -- # ver1_l=2 00:04:57.226 04:55:17 env -- scripts/common.sh@341 -- # ver2_l=1 00:04:57.226 04:55:17 env -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:04:57.226 04:55:17 env -- scripts/common.sh@344 -- # case "$op" in 00:04:57.227 04:55:17 env -- scripts/common.sh@345 -- # : 1 00:04:57.227 04:55:17 env -- scripts/common.sh@364 -- # (( v = 0 )) 00:04:57.227 04:55:17 env -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:04:57.227 04:55:17 env -- scripts/common.sh@365 -- # decimal 1 00:04:57.227 04:55:17 env -- scripts/common.sh@353 -- # local d=1 00:04:57.227 04:55:17 env -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:04:57.227 04:55:17 env -- scripts/common.sh@355 -- # echo 1 00:04:57.227 04:55:17 env -- scripts/common.sh@365 -- # ver1[v]=1 00:04:57.227 04:55:17 env -- scripts/common.sh@366 -- # decimal 2 00:04:57.227 04:55:17 env -- scripts/common.sh@353 -- # local d=2 00:04:57.227 04:55:17 env -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:04:57.227 04:55:17 env -- scripts/common.sh@355 -- # echo 2 00:04:57.227 04:55:17 env -- scripts/common.sh@366 -- # ver2[v]=2 00:04:57.227 04:55:17 env -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:04:57.227 04:55:17 env -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:04:57.227 04:55:17 env -- scripts/common.sh@368 -- # return 0 00:04:57.227 04:55:17 env -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:04:57.227 04:55:17 env -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:04:57.227 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:57.227 --rc genhtml_branch_coverage=1 00:04:57.227 --rc genhtml_function_coverage=1 00:04:57.227 --rc genhtml_legend=1 00:04:57.227 --rc geninfo_all_blocks=1 00:04:57.227 --rc geninfo_unexecuted_blocks=1 00:04:57.227 00:04:57.227 ' 00:04:57.227 04:55:17 env -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:04:57.227 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:57.227 --rc genhtml_branch_coverage=1 00:04:57.227 --rc genhtml_function_coverage=1 00:04:57.227 --rc genhtml_legend=1 00:04:57.227 --rc geninfo_all_blocks=1 00:04:57.227 --rc geninfo_unexecuted_blocks=1 00:04:57.227 00:04:57.227 ' 00:04:57.227 04:55:17 env -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:04:57.227 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:57.227 --rc genhtml_branch_coverage=1 00:04:57.227 --rc genhtml_function_coverage=1 00:04:57.227 --rc 
genhtml_legend=1 00:04:57.227 --rc geninfo_all_blocks=1 00:04:57.227 --rc geninfo_unexecuted_blocks=1 00:04:57.227 00:04:57.227 ' 00:04:57.227 04:55:17 env -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:04:57.227 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:04:57.227 --rc genhtml_branch_coverage=1 00:04:57.227 --rc genhtml_function_coverage=1 00:04:57.227 --rc genhtml_legend=1 00:04:57.227 --rc geninfo_all_blocks=1 00:04:57.227 --rc geninfo_unexecuted_blocks=1 00:04:57.227 00:04:57.227 ' 00:04:57.227 04:55:17 env -- env/env.sh@10 -- # run_test env_memory /home/vagrant/spdk_repo/spdk/test/env/memory/memory_ut 00:04:57.227 04:55:17 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:57.227 04:55:17 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:57.227 04:55:17 env -- common/autotest_common.sh@10 -- # set +x 00:04:57.227 ************************************ 00:04:57.227 START TEST env_memory 00:04:57.227 ************************************ 00:04:57.227 04:55:17 env.env_memory -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/env/memory/memory_ut 00:04:57.487 00:04:57.488 00:04:57.488 CUnit - A unit testing framework for C - Version 2.1-3 00:04:57.488 http://cunit.sourceforge.net/ 00:04:57.488 00:04:57.488 00:04:57.488 Suite: memory 00:04:57.488 Test: alloc and free memory map ...[2024-12-15 04:55:17.400033] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:04:57.488 passed 00:04:57.488 Test: mem map translation ...[2024-12-15 04:55:17.439247] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 595:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:04:57.488 [2024-12-15 04:55:17.439390] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 595:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:04:57.488 [2024-12-15 04:55:17.439524] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 589:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:04:57.488 [2024-12-15 04:55:17.439567] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 605:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:04:57.488 passed 00:04:57.488 Test: mem map registration ...[2024-12-15 04:55:17.508226] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=200000 len=1234 00:04:57.488 [2024-12-15 04:55:17.508368] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/memory.c: 347:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=4d2 len=2097152 00:04:57.488 passed 00:04:57.488 Test: mem map adjacent registrations ...passed 00:04:57.488 00:04:57.488 Run Summary: Type Total Ran Passed Failed Inactive 00:04:57.488 suites 1 1 n/a 0 0 00:04:57.488 tests 4 4 4 0 0 00:04:57.488 asserts 152 152 152 0 n/a 00:04:57.488 00:04:57.488 Elapsed time = 0.234 seconds 00:04:57.488 00:04:57.488 real 0m0.271s 00:04:57.488 user 0m0.245s 00:04:57.488 sys 0m0.018s 00:04:57.488 04:55:17 env.env_memory -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:57.488 04:55:17 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:04:57.488 ************************************ 00:04:57.488 END TEST env_memory 00:04:57.488 ************************************ 00:04:57.748 04:55:17 env -- env/env.sh@11 -- # run_test env_vtophys 
/home/vagrant/spdk_repo/spdk/test/env/vtophys/vtophys 00:04:57.748 04:55:17 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:57.748 04:55:17 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:57.748 04:55:17 env -- common/autotest_common.sh@10 -- # set +x 00:04:57.748 ************************************ 00:04:57.748 START TEST env_vtophys 00:04:57.748 ************************************ 00:04:57.748 04:55:17 env.env_vtophys -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/env/vtophys/vtophys 00:04:57.748 EAL: lib.eal log level changed from notice to debug 00:04:57.748 EAL: Detected lcore 0 as core 0 on socket 0 00:04:57.749 EAL: Detected lcore 1 as core 0 on socket 0 00:04:57.749 EAL: Detected lcore 2 as core 0 on socket 0 00:04:57.749 EAL: Detected lcore 3 as core 0 on socket 0 00:04:57.749 EAL: Detected lcore 4 as core 0 on socket 0 00:04:57.749 EAL: Detected lcore 5 as core 0 on socket 0 00:04:57.749 EAL: Detected lcore 6 as core 0 on socket 0 00:04:57.749 EAL: Detected lcore 7 as core 0 on socket 0 00:04:57.749 EAL: Detected lcore 8 as core 0 on socket 0 00:04:57.749 EAL: Detected lcore 9 as core 0 on socket 0 00:04:57.749 EAL: Maximum logical cores by configuration: 128 00:04:57.749 EAL: Detected CPU lcores: 10 00:04:57.749 EAL: Detected NUMA nodes: 1 00:04:57.749 EAL: Checking presence of .so 'librte_eal.so.24.0' 00:04:57.749 EAL: Detected shared linkage of DPDK 00:04:57.749 EAL: open shared lib /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so.24.0 00:04:57.749 EAL: open shared lib /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so.24.0 00:04:57.749 EAL: Registered [vdev] bus. 00:04:57.749 EAL: bus.vdev log level changed from disabled to notice 00:04:57.749 EAL: open shared lib /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so.24.0 00:04:57.749 EAL: open shared lib /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so.24.0 00:04:57.749 EAL: pmd.net.i40e.init log level changed from disabled to notice 00:04:57.749 EAL: pmd.net.i40e.driver log level changed from disabled to notice 00:04:57.749 EAL: open shared lib /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_pci.so 00:04:57.749 EAL: open shared lib /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_bus_vdev.so 00:04:57.749 EAL: open shared lib /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_mempool_ring.so 00:04:57.749 EAL: open shared lib /home/vagrant/spdk_repo/dpdk/build/lib/dpdk/pmds-24.0/librte_net_i40e.so 00:04:57.749 EAL: No shared files mode enabled, IPC will be disabled 00:04:57.749 EAL: No shared files mode enabled, IPC is disabled 00:04:57.749 EAL: Selected IOVA mode 'PA' 00:04:57.749 EAL: Probing VFIO support... 00:04:57.749 EAL: Module /sys/module/vfio not found! error 2 (No such file or directory) 00:04:57.749 EAL: VFIO modules not loaded, skipping VFIO support... 00:04:57.749 EAL: Ask a virtual area of 0x2e000 bytes 00:04:57.749 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:04:57.749 EAL: Setting up physically contiguous memory... 
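
EAL probes VFIO first and falls back when /sys/module/vfio is absent, which is why these controllers ended up on uio_pci_generic and IOVA mode 'PA' was selected. A hedged pre-flight check for the same condition (modprobe needs root and an IOMMU-capable host):

  if [[ -d /sys/module/vfio_pci ]]; then
    echo "vfio-pci loaded: EAL can use VFIO"
  else
    echo "vfio-pci not loaded: EAL will fall back, as in this log"
    # sudo modprobe vfio-pci    # optional; requires IOMMU support
  fi
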
00:04:57.749 EAL: Setting maximum number of open files to 524288 00:04:57.749 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:04:57.749 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:04:57.749 EAL: Ask a virtual area of 0x61000 bytes 00:04:57.749 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:04:57.749 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:57.749 EAL: Ask a virtual area of 0x400000000 bytes 00:04:57.749 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:04:57.749 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:04:57.749 EAL: Ask a virtual area of 0x61000 bytes 00:04:57.749 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:04:57.749 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:57.749 EAL: Ask a virtual area of 0x400000000 bytes 00:04:57.749 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:04:57.749 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:04:57.749 EAL: Ask a virtual area of 0x61000 bytes 00:04:57.749 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:04:57.749 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:57.749 EAL: Ask a virtual area of 0x400000000 bytes 00:04:57.749 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:04:57.749 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:04:57.749 EAL: Ask a virtual area of 0x61000 bytes 00:04:57.749 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:04:57.749 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:04:57.749 EAL: Ask a virtual area of 0x400000000 bytes 00:04:57.749 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:04:57.749 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:04:57.749 EAL: Hugepages will be freed exactly as allocated. 00:04:57.749 EAL: No shared files mode enabled, IPC is disabled 00:04:57.749 EAL: No shared files mode enabled, IPC is disabled 00:04:57.749 EAL: TSC frequency is ~2600000 KHz 00:04:57.749 EAL: Main lcore 0 is ready (tid=7fd4b0864a40;cpuset=[0]) 00:04:57.749 EAL: Trying to obtain current memory policy. 00:04:57.749 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:57.749 EAL: Restoring previous memory policy: 0 00:04:57.749 EAL: request: mp_malloc_sync 00:04:57.749 EAL: No shared files mode enabled, IPC is disabled 00:04:57.749 EAL: Heap on socket 0 was expanded by 2MB 00:04:57.749 EAL: Module /sys/module/vfio not found! error 2 (No such file or directory) 00:04:57.749 EAL: No shared files mode enabled, IPC is disabled 00:04:57.749 EAL: No PCI address specified using 'addr=' in: bus=pci 00:04:57.749 EAL: Mem event callback 'spdk:(nil)' registered 00:04:57.749 EAL: Module /sys/module/vfio_pci not found! error 2 (No such file or directory) 00:04:58.010 00:04:58.010 00:04:58.010 CUnit - A unit testing framework for C - Version 2.1-3 00:04:58.010 http://cunit.sourceforge.net/ 00:04:58.010 00:04:58.010 00:04:58.010 Suite: components_suite 00:04:58.271 Test: vtophys_malloc_test ...passed 00:04:58.271 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy. 
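
The four memseg lists above are carved from 2 MB hugepages (page size 0x800kB = 2048 kB), 8192 segments each. If the "Heap on socket 0 was expanded" steps below fail, the usual cause is too few reserved hugepages; a minimal reservation sketch (run as root, on a 2 MB-hugepage kernel):

  echo 1024 > /sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages
  grep -i huge /proc/meminfo    # confirm HugePages_Total / HugePages_Free
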
00:04:58.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.271 EAL: Restoring previous memory policy: 4 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was expanded by 4MB 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was shrunk by 4MB 00:04:58.271 EAL: Trying to obtain current memory policy. 00:04:58.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.271 EAL: Restoring previous memory policy: 4 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was expanded by 6MB 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was shrunk by 6MB 00:04:58.271 EAL: Trying to obtain current memory policy. 00:04:58.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.271 EAL: Restoring previous memory policy: 4 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was expanded by 10MB 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was shrunk by 10MB 00:04:58.271 EAL: Trying to obtain current memory policy. 00:04:58.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.271 EAL: Restoring previous memory policy: 4 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was expanded by 18MB 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was shrunk by 18MB 00:04:58.271 EAL: Trying to obtain current memory policy. 00:04:58.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.271 EAL: Restoring previous memory policy: 4 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was expanded by 34MB 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was shrunk by 34MB 00:04:58.271 EAL: Trying to obtain current memory policy. 
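
The expansion sizes logged by vtophys_malloc_test (4, 6, 10, 18, 34 MB so far) match (2^k + 2) MB for k = 1, 2, 3, ... — an observation from this log, not a documented contract — so each request roughly doubles and the mem-event callback is exercised across allocation orders. The observed sequence reproduces with:

  for k in $(seq 1 10); do printf '%dMB ' $(( (1 << k) + 2 )); done; echo
  # 4MB 6MB 10MB 18MB 34MB 66MB 130MB 258MB 514MB 1026MB
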
00:04:58.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.271 EAL: Restoring previous memory policy: 4 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was expanded by 66MB 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was shrunk by 66MB 00:04:58.271 EAL: Trying to obtain current memory policy. 00:04:58.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.271 EAL: Restoring previous memory policy: 4 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was expanded by 130MB 00:04:58.271 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.271 EAL: request: mp_malloc_sync 00:04:58.271 EAL: No shared files mode enabled, IPC is disabled 00:04:58.271 EAL: Heap on socket 0 was shrunk by 130MB 00:04:58.271 EAL: Trying to obtain current memory policy. 00:04:58.271 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.532 EAL: Restoring previous memory policy: 4 00:04:58.532 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.532 EAL: request: mp_malloc_sync 00:04:58.532 EAL: No shared files mode enabled, IPC is disabled 00:04:58.532 EAL: Heap on socket 0 was expanded by 258MB 00:04:58.532 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.532 EAL: request: mp_malloc_sync 00:04:58.532 EAL: No shared files mode enabled, IPC is disabled 00:04:58.532 EAL: Heap on socket 0 was shrunk by 258MB 00:04:58.532 EAL: Trying to obtain current memory policy. 00:04:58.532 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.532 EAL: Restoring previous memory policy: 4 00:04:58.532 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.532 EAL: request: mp_malloc_sync 00:04:58.532 EAL: No shared files mode enabled, IPC is disabled 00:04:58.532 EAL: Heap on socket 0 was expanded by 514MB 00:04:58.532 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.792 EAL: request: mp_malloc_sync 00:04:58.792 EAL: No shared files mode enabled, IPC is disabled 00:04:58.792 EAL: Heap on socket 0 was shrunk by 514MB 00:04:58.792 EAL: Trying to obtain current memory policy. 
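
To iterate on just this suite outside the full autotest run, the test binary can be invoked directly; it needs root and the hugepage reservation sketched earlier (path as used by this job):

  sudo /home/vagrant/spdk_repo/spdk/test/env/vtophys/vtophys
  echo "vtophys exit code: $?"
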
00:04:58.792 EAL: Setting policy MPOL_PREFERRED for socket 0 00:04:58.792 EAL: Restoring previous memory policy: 4 00:04:58.792 EAL: Calling mem event callback 'spdk:(nil)' 00:04:58.792 EAL: request: mp_malloc_sync 00:04:58.792 EAL: No shared files mode enabled, IPC is disabled 00:04:58.792 EAL: Heap on socket 0 was expanded by 1026MB 00:04:59.053 EAL: Calling mem event callback 'spdk:(nil)' 00:04:59.053 passed 00:04:59.053 00:04:59.053 Run Summary: Type Total Ran Passed Failed Inactive 00:04:59.053 suites 1 1 n/a 0 0 00:04:59.053 tests 2 2 2 0 0 00:04:59.053 asserts 5547 5547 5547 0 n/a 00:04:59.053 00:04:59.053 Elapsed time = 1.188 seconds 00:04:59.053 EAL: request: mp_malloc_sync 00:04:59.053 EAL: No shared files mode enabled, IPC is disabled 00:04:59.053 EAL: Heap on socket 0 was shrunk by 1026MB 00:04:59.053 EAL: Calling mem event callback 'spdk:(nil)' 00:04:59.053 EAL: request: mp_malloc_sync 00:04:59.053 EAL: No shared files mode enabled, IPC is disabled 00:04:59.053 EAL: Heap on socket 0 was shrunk by 2MB 00:04:59.053 EAL: No shared files mode enabled, IPC is disabled 00:04:59.053 EAL: No shared files mode enabled, IPC is disabled 00:04:59.053 EAL: No shared files mode enabled, IPC is disabled 00:04:59.053 00:04:59.053 real 0m1.457s 00:04:59.053 user 0m0.597s 00:04:59.053 sys 0m0.708s 00:04:59.053 04:55:19 env.env_vtophys -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:59.053 04:55:19 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x 00:04:59.053 ************************************ 00:04:59.053 END TEST env_vtophys 00:04:59.053 ************************************ 00:04:59.315 04:55:19 env -- env/env.sh@12 -- # run_test env_pci /home/vagrant/spdk_repo/spdk/test/env/pci/pci_ut 00:04:59.315 04:55:19 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:59.315 04:55:19 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:59.315 04:55:19 env -- common/autotest_common.sh@10 -- # set +x 00:04:59.315 ************************************ 00:04:59.315 START TEST env_pci 00:04:59.315 ************************************ 00:04:59.315 04:55:19 env.env_pci -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/env/pci/pci_ut 00:04:59.315 00:04:59.315 00:04:59.315 CUnit - A unit testing framework for C - Version 2.1-3 00:04:59.315 http://cunit.sourceforge.net/ 00:04:59.315 00:04:59.315 00:04:59.315 Suite: pci 00:04:59.315 Test: pci_hook ...[2024-12-15 04:55:19.228762] /home/vagrant/spdk_repo/spdk/lib/env_dpdk/pci.c:1117:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 71079 has claimed it 00:04:59.315 passed 00:04:59.315 00:04:59.315 Run Summary: Type Total Ran Passed Failed Inactive 00:04:59.315 suites 1 1 n/a 0 0 00:04:59.315 tests 1 1 1 0 0 00:04:59.315 asserts 25 25 25 0 n/a 00:04:59.315 00:04:59.315 Elapsed time = 0.006 seconds 00:04:59.315 EAL: Cannot find device (10000:00:01.0) 00:04:59.315 EAL: Failed to attach device on primary process 00:04:59.315 ************************************ 00:04:59.315 END TEST env_pci 00:04:59.315 ************************************ 00:04:59.315 00:04:59.315 real 0m0.063s 00:04:59.315 user 0m0.027s 00:04:59.315 sys 0m0.035s 00:04:59.315 04:55:19 env.env_pci -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:59.315 04:55:19 env.env_pci -- common/autotest_common.sh@10 -- # set +x 00:04:59.315 04:55:19 env -- env/env.sh@14 -- # argv='-c 0x1 ' 00:04:59.315 04:55:19 env -- env/env.sh@15 -- # uname 00:04:59.315 04:55:19 env 
-- env/env.sh@15 -- # '[' Linux = Linux ']' 00:04:59.315 04:55:19 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000 00:04:59.315 04:55:19 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /home/vagrant/spdk_repo/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:59.315 04:55:19 env -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:04:59.315 04:55:19 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:59.315 04:55:19 env -- common/autotest_common.sh@10 -- # set +x 00:04:59.315 ************************************ 00:04:59.315 START TEST env_dpdk_post_init 00:04:59.315 ************************************ 00:04:59.315 04:55:19 env.env_dpdk_post_init -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:04:59.315 EAL: Detected CPU lcores: 10 00:04:59.315 EAL: Detected NUMA nodes: 1 00:04:59.315 EAL: Detected shared linkage of DPDK 00:04:59.315 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:59.315 EAL: Selected IOVA mode 'PA' 00:04:59.576 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:59.576 EAL: Probe PCI driver: spdk_nvme (1b36:0010) device: 0000:00:10.0 (socket -1) 00:04:59.576 EAL: Probe PCI driver: spdk_nvme (1b36:0010) device: 0000:00:11.0 (socket -1) 00:04:59.576 EAL: Probe PCI driver: spdk_nvme (1b36:0010) device: 0000:00:12.0 (socket -1) 00:04:59.576 EAL: Probe PCI driver: spdk_nvme (1b36:0010) device: 0000:00:13.0 (socket -1) 00:04:59.576 Starting DPDK initialization... 00:04:59.576 Starting SPDK post initialization... 00:04:59.576 SPDK NVMe probe 00:04:59.576 Attaching to 0000:00:10.0 00:04:59.576 Attaching to 0000:00:11.0 00:04:59.577 Attaching to 0000:00:12.0 00:04:59.577 Attaching to 0000:00:13.0 00:04:59.577 Attached to 0000:00:13.0 00:04:59.577 Attached to 0000:00:10.0 00:04:59.577 Attached to 0000:00:11.0 00:04:59.577 Attached to 0000:00:12.0 00:04:59.577 Cleaning up... 
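
The spdk_nvme probe attached to all four controllers here because scripts/setup.sh rebound them to uio_pci_generic during afterboot (the "1b36 0010: nvme -> uio_pci_generic" lines above). The same script manages binding by hand:

  setup=/home/vagrant/spdk_repo/spdk/scripts/setup.sh
  sudo "$setup" status   # show the current driver per BDF
  sudo "$setup"          # bind NVMe devices for SPDK (vfio-pci or uio)
  sudo "$setup" reset    # return devices to the kernel nvme driver
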
00:04:59.577 ************************************ 00:04:59.577 END TEST env_dpdk_post_init 00:04:59.577 ************************************ 00:04:59.577 00:04:59.577 real 0m0.260s 00:04:59.577 user 0m0.085s 00:04:59.577 sys 0m0.077s 00:04:59.577 04:55:19 env.env_dpdk_post_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:59.577 04:55:19 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:04:59.577 04:55:19 env -- env/env.sh@26 -- # uname 00:04:59.577 04:55:19 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:04:59.577 04:55:19 env -- env/env.sh@29 -- # run_test env_mem_callbacks /home/vagrant/spdk_repo/spdk/test/env/mem_callbacks/mem_callbacks 00:04:59.577 04:55:19 env -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:59.577 04:55:19 env -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:59.577 04:55:19 env -- common/autotest_common.sh@10 -- # set +x 00:04:59.577 ************************************ 00:04:59.577 START TEST env_mem_callbacks 00:04:59.577 ************************************ 00:04:59.577 04:55:19 env.env_mem_callbacks -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/env/mem_callbacks/mem_callbacks 00:04:59.577 EAL: Detected CPU lcores: 10 00:04:59.577 EAL: Detected NUMA nodes: 1 00:04:59.577 EAL: Detected shared linkage of DPDK 00:04:59.577 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:04:59.577 EAL: Selected IOVA mode 'PA' 00:04:59.838 00:04:59.838 00:04:59.838 CUnit - A unit testing framework for C - Version 2.1-3 00:04:59.838 http://cunit.sourceforge.net/ 00:04:59.838 00:04:59.838 00:04:59.838 Suite: memory 00:04:59.838 Test: test ... 00:04:59.838 register 0x200000200000 2097152 00:04:59.838 malloc 3145728 00:04:59.838 TELEMETRY: No legacy callbacks, legacy socket not created 00:04:59.838 register 0x200000400000 4194304 00:04:59.838 buf 0x200000500000 len 3145728 PASSED 00:04:59.838 malloc 64 00:04:59.838 buf 0x2000004fff40 len 64 PASSED 00:04:59.838 malloc 4194304 00:04:59.838 register 0x200000800000 6291456 00:04:59.838 buf 0x200000a00000 len 4194304 PASSED 00:04:59.838 free 0x200000500000 3145728 00:04:59.838 free 0x2000004fff40 64 00:04:59.838 unregister 0x200000400000 4194304 PASSED 00:04:59.838 free 0x200000a00000 4194304 00:04:59.838 unregister 0x200000800000 6291456 PASSED 00:04:59.838 malloc 8388608 00:04:59.838 register 0x200000400000 10485760 00:04:59.838 buf 0x200000600000 len 8388608 PASSED 00:04:59.838 free 0x200000600000 8388608 00:04:59.838 unregister 0x200000400000 10485760 PASSED 00:04:59.838 passed 00:04:59.838 00:04:59.838 Run Summary: Type Total Ran Passed Failed Inactive 00:04:59.838 suites 1 1 n/a 0 0 00:04:59.838 tests 1 1 1 0 0 00:04:59.838 asserts 15 15 15 0 n/a 00:04:59.838 00:04:59.838 Elapsed time = 0.011 seconds 00:04:59.838 00:04:59.838 real 0m0.176s 00:04:59.838 user 0m0.024s 00:04:59.838 sys 0m0.049s 00:04:59.838 04:55:19 env.env_mem_callbacks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:59.838 04:55:19 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:04:59.838 ************************************ 00:04:59.838 END TEST env_mem_callbacks 00:04:59.838 ************************************ 00:04:59.838 00:04:59.838 real 0m2.721s 00:04:59.838 user 0m1.124s 00:04:59.838 sys 0m1.130s 00:04:59.838 ************************************ 00:04:59.838 END TEST env 00:04:59.838 ************************************ 00:04:59.838 04:55:19 env -- common/autotest_common.sh@1130 -- # xtrace_disable 00:04:59.838 04:55:19 env -- 
common/autotest_common.sh@10 -- # set +x 00:04:59.838 04:55:19 -- spdk/autotest.sh@156 -- # run_test rpc /home/vagrant/spdk_repo/spdk/test/rpc/rpc.sh 00:04:59.838 04:55:19 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:04:59.838 04:55:19 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:04:59.838 04:55:19 -- common/autotest_common.sh@10 -- # set +x 00:04:59.838 ************************************ 00:04:59.838 START TEST rpc 00:04:59.838 ************************************ 00:04:59.838 04:55:19 rpc -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/rpc/rpc.sh 00:05:00.099 * Looking for test storage... 00:05:00.099 * Found test storage at /home/vagrant/spdk_repo/spdk/test/rpc 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@1711 -- # lcov --version 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:00.099 04:55:20 rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:00.099 04:55:20 rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:00.099 04:55:20 rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:00.099 04:55:20 rpc -- scripts/common.sh@336 -- # IFS=.-: 00:05:00.099 04:55:20 rpc -- scripts/common.sh@336 -- # read -ra ver1 00:05:00.099 04:55:20 rpc -- scripts/common.sh@337 -- # IFS=.-: 00:05:00.099 04:55:20 rpc -- scripts/common.sh@337 -- # read -ra ver2 00:05:00.099 04:55:20 rpc -- scripts/common.sh@338 -- # local 'op=<' 00:05:00.099 04:55:20 rpc -- scripts/common.sh@340 -- # ver1_l=2 00:05:00.099 04:55:20 rpc -- scripts/common.sh@341 -- # ver2_l=1 00:05:00.099 04:55:20 rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:00.099 04:55:20 rpc -- scripts/common.sh@344 -- # case "$op" in 00:05:00.099 04:55:20 rpc -- scripts/common.sh@345 -- # : 1 00:05:00.099 04:55:20 rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:00.099 04:55:20 rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:00.099 04:55:20 rpc -- scripts/common.sh@365 -- # decimal 1 00:05:00.099 04:55:20 rpc -- scripts/common.sh@353 -- # local d=1 00:05:00.099 04:55:20 rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:00.099 04:55:20 rpc -- scripts/common.sh@355 -- # echo 1 00:05:00.099 04:55:20 rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:05:00.099 04:55:20 rpc -- scripts/common.sh@366 -- # decimal 2 00:05:00.099 04:55:20 rpc -- scripts/common.sh@353 -- # local d=2 00:05:00.099 04:55:20 rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:00.099 04:55:20 rpc -- scripts/common.sh@355 -- # echo 2 00:05:00.099 04:55:20 rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:05:00.099 04:55:20 rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:00.099 04:55:20 rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:00.099 04:55:20 rpc -- scripts/common.sh@368 -- # return 0 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:00.099 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.099 --rc genhtml_branch_coverage=1 00:05:00.099 --rc genhtml_function_coverage=1 00:05:00.099 --rc genhtml_legend=1 00:05:00.099 --rc geninfo_all_blocks=1 00:05:00.099 --rc geninfo_unexecuted_blocks=1 00:05:00.099 00:05:00.099 ' 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:00.099 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.099 --rc genhtml_branch_coverage=1 00:05:00.099 --rc genhtml_function_coverage=1 00:05:00.099 --rc genhtml_legend=1 00:05:00.099 --rc geninfo_all_blocks=1 00:05:00.099 --rc geninfo_unexecuted_blocks=1 00:05:00.099 00:05:00.099 ' 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:00.099 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.099 --rc genhtml_branch_coverage=1 00:05:00.099 --rc genhtml_function_coverage=1 00:05:00.099 --rc genhtml_legend=1 00:05:00.099 --rc geninfo_all_blocks=1 00:05:00.099 --rc geninfo_unexecuted_blocks=1 00:05:00.099 00:05:00.099 ' 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:00.099 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:00.099 --rc genhtml_branch_coverage=1 00:05:00.099 --rc genhtml_function_coverage=1 00:05:00.099 --rc genhtml_legend=1 00:05:00.099 --rc geninfo_all_blocks=1 00:05:00.099 --rc geninfo_unexecuted_blocks=1 00:05:00.099 00:05:00.099 ' 00:05:00.099 04:55:20 rpc -- rpc/rpc.sh@65 -- # spdk_pid=71206 00:05:00.099 04:55:20 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:00.099 04:55:20 rpc -- rpc/rpc.sh@67 -- # waitforlisten 71206 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@835 -- # '[' -z 71206 ']' 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:00.099 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
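
waitforlisten polls until the target's UNIX-domain RPC socket is up (the spdk_tgt launch follows on the next trace line). A reduced sketch of the same readiness loop — it checks socket existence only, whereas the real helper also retries an RPC call:

  /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -e bdev &
  tgt_pid=$!
  until [[ -S /var/tmp/spdk.sock ]]; do sleep 0.1; done
  echo "spdk_tgt ($tgt_pid) is listening on /var/tmp/spdk.sock"
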
00:05:00.099 04:55:20 rpc -- rpc/rpc.sh@64 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -e bdev 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:00.099 04:55:20 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:00.099 [2024-12-15 04:55:20.196766] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:05:00.100 [2024-12-15 04:55:20.196918] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid71206 ] 00:05:00.361 [2024-12-15 04:55:20.358142] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:00.361 [2024-12-15 04:55:20.387421] app.c: 612:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:00.361 [2024-12-15 04:55:20.387511] app.c: 613:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 71206' to capture a snapshot of events at runtime. 00:05:00.361 [2024-12-15 04:55:20.387530] app.c: 618:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:00.361 [2024-12-15 04:55:20.387539] app.c: 619:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:00.361 [2024-12-15 04:55:20.387550] app.c: 620:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid71206 for offline analysis/debug. 00:05:00.361 [2024-12-15 04:55:20.387976] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:00.933 04:55:21 rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:00.933 04:55:21 rpc -- common/autotest_common.sh@868 -- # return 0 00:05:00.933 04:55:21 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/test/rpc 00:05:00.933 04:55:21 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/test/rpc 00:05:00.933 04:55:21 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:00.933 04:55:21 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:00.933 04:55:21 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:00.933 04:55:21 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:00.933 04:55:21 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:00.933 ************************************ 00:05:00.933 START TEST rpc_integrity 00:05:00.933 ************************************ 00:05:00.933 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:05:00.933 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:00.933 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:00.933 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:00.933 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:00.933 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:00.933 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:01.194 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:01.194 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 
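
rpc_cmd forwards to scripts/rpc.py against that socket. The create-and-verify flow of rpc_integrity, driven by hand, looks like this ('8 512' means 8 MB of 512-byte blocks, hence the 16384-block Malloc0 in the JSON below):

  rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
  name=$("$rpc" bdev_malloc_create 8 512)   # prints the new bdev's name
  "$rpc" bdev_get_bdevs | jq length         # -> 1
  echo "created $name"
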
00:05:01.194 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.194 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.194 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.194 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:01.194 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:01.194 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.194 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.194 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.194 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:01.194 { 00:05:01.194 "name": "Malloc0", 00:05:01.194 "aliases": [ 00:05:01.194 "7d8e9635-5216-4fc3-9b26-24d53c2dea63" 00:05:01.194 ], 00:05:01.194 "product_name": "Malloc disk", 00:05:01.194 "block_size": 512, 00:05:01.194 "num_blocks": 16384, 00:05:01.194 "uuid": "7d8e9635-5216-4fc3-9b26-24d53c2dea63", 00:05:01.194 "assigned_rate_limits": { 00:05:01.194 "rw_ios_per_sec": 0, 00:05:01.194 "rw_mbytes_per_sec": 0, 00:05:01.194 "r_mbytes_per_sec": 0, 00:05:01.194 "w_mbytes_per_sec": 0 00:05:01.194 }, 00:05:01.194 "claimed": false, 00:05:01.194 "zoned": false, 00:05:01.194 "supported_io_types": { 00:05:01.194 "read": true, 00:05:01.194 "write": true, 00:05:01.194 "unmap": true, 00:05:01.195 "flush": true, 00:05:01.195 "reset": true, 00:05:01.195 "nvme_admin": false, 00:05:01.195 "nvme_io": false, 00:05:01.195 "nvme_io_md": false, 00:05:01.195 "write_zeroes": true, 00:05:01.195 "zcopy": true, 00:05:01.195 "get_zone_info": false, 00:05:01.195 "zone_management": false, 00:05:01.195 "zone_append": false, 00:05:01.195 "compare": false, 00:05:01.195 "compare_and_write": false, 00:05:01.195 "abort": true, 00:05:01.195 "seek_hole": false, 00:05:01.195 "seek_data": false, 00:05:01.195 "copy": true, 00:05:01.195 "nvme_iov_md": false 00:05:01.195 }, 00:05:01.195 "memory_domains": [ 00:05:01.195 { 00:05:01.195 "dma_device_id": "system", 00:05:01.195 "dma_device_type": 1 00:05:01.195 }, 00:05:01.195 { 00:05:01.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:01.195 "dma_device_type": 2 00:05:01.195 } 00:05:01.195 ], 00:05:01.195 "driver_specific": {} 00:05:01.195 } 00:05:01.195 ]' 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.195 [2024-12-15 04:55:21.166691] vbdev_passthru.c: 608:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:01.195 [2024-12-15 04:55:21.166767] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:01.195 [2024-12-15 04:55:21.166802] vbdev_passthru.c: 682:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000008480 00:05:01.195 [2024-12-15 04:55:21.166813] vbdev_passthru.c: 697:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:01.195 [2024-12-15 04:55:21.169478] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:01.195 [2024-12-15 04:55:21.169535] vbdev_passthru.c: 711:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:01.195 
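
bdev_passthru_create layers Passthru0 over Malloc0 and claims the base exclusively, which is why the bdev_get_bdevs output that follows shows Malloc0 with "claimed": true and "claim_type": "exclusive_write". The same layering and teardown via rpc.py:

  rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
  "$rpc" bdev_passthru_create -b Malloc0 -p Passthru0
  "$rpc" bdev_get_bdevs | jq -r '.[].name'    # Malloc0, Passthru0
  "$rpc" bdev_passthru_delete Passthru0
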
Passthru0 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:01.195 { 00:05:01.195 "name": "Malloc0", 00:05:01.195 "aliases": [ 00:05:01.195 "7d8e9635-5216-4fc3-9b26-24d53c2dea63" 00:05:01.195 ], 00:05:01.195 "product_name": "Malloc disk", 00:05:01.195 "block_size": 512, 00:05:01.195 "num_blocks": 16384, 00:05:01.195 "uuid": "7d8e9635-5216-4fc3-9b26-24d53c2dea63", 00:05:01.195 "assigned_rate_limits": { 00:05:01.195 "rw_ios_per_sec": 0, 00:05:01.195 "rw_mbytes_per_sec": 0, 00:05:01.195 "r_mbytes_per_sec": 0, 00:05:01.195 "w_mbytes_per_sec": 0 00:05:01.195 }, 00:05:01.195 "claimed": true, 00:05:01.195 "claim_type": "exclusive_write", 00:05:01.195 "zoned": false, 00:05:01.195 "supported_io_types": { 00:05:01.195 "read": true, 00:05:01.195 "write": true, 00:05:01.195 "unmap": true, 00:05:01.195 "flush": true, 00:05:01.195 "reset": true, 00:05:01.195 "nvme_admin": false, 00:05:01.195 "nvme_io": false, 00:05:01.195 "nvme_io_md": false, 00:05:01.195 "write_zeroes": true, 00:05:01.195 "zcopy": true, 00:05:01.195 "get_zone_info": false, 00:05:01.195 "zone_management": false, 00:05:01.195 "zone_append": false, 00:05:01.195 "compare": false, 00:05:01.195 "compare_and_write": false, 00:05:01.195 "abort": true, 00:05:01.195 "seek_hole": false, 00:05:01.195 "seek_data": false, 00:05:01.195 "copy": true, 00:05:01.195 "nvme_iov_md": false 00:05:01.195 }, 00:05:01.195 "memory_domains": [ 00:05:01.195 { 00:05:01.195 "dma_device_id": "system", 00:05:01.195 "dma_device_type": 1 00:05:01.195 }, 00:05:01.195 { 00:05:01.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:01.195 "dma_device_type": 2 00:05:01.195 } 00:05:01.195 ], 00:05:01.195 "driver_specific": {} 00:05:01.195 }, 00:05:01.195 { 00:05:01.195 "name": "Passthru0", 00:05:01.195 "aliases": [ 00:05:01.195 "486b2d86-d9f3-5f4f-9e03-24a1becbdea9" 00:05:01.195 ], 00:05:01.195 "product_name": "passthru", 00:05:01.195 "block_size": 512, 00:05:01.195 "num_blocks": 16384, 00:05:01.195 "uuid": "486b2d86-d9f3-5f4f-9e03-24a1becbdea9", 00:05:01.195 "assigned_rate_limits": { 00:05:01.195 "rw_ios_per_sec": 0, 00:05:01.195 "rw_mbytes_per_sec": 0, 00:05:01.195 "r_mbytes_per_sec": 0, 00:05:01.195 "w_mbytes_per_sec": 0 00:05:01.195 }, 00:05:01.195 "claimed": false, 00:05:01.195 "zoned": false, 00:05:01.195 "supported_io_types": { 00:05:01.195 "read": true, 00:05:01.195 "write": true, 00:05:01.195 "unmap": true, 00:05:01.195 "flush": true, 00:05:01.195 "reset": true, 00:05:01.195 "nvme_admin": false, 00:05:01.195 "nvme_io": false, 00:05:01.195 "nvme_io_md": false, 00:05:01.195 "write_zeroes": true, 00:05:01.195 "zcopy": true, 00:05:01.195 "get_zone_info": false, 00:05:01.195 "zone_management": false, 00:05:01.195 "zone_append": false, 00:05:01.195 "compare": false, 00:05:01.195 "compare_and_write": false, 00:05:01.195 "abort": true, 00:05:01.195 "seek_hole": false, 00:05:01.195 "seek_data": false, 00:05:01.195 "copy": true, 00:05:01.195 "nvme_iov_md": false 00:05:01.195 }, 00:05:01.195 "memory_domains": [ 00:05:01.195 { 00:05:01.195 "dma_device_id": "system", 00:05:01.195 "dma_device_type": 1 00:05:01.195 }, 
00:05:01.195 { 00:05:01.195 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:01.195 "dma_device_type": 2 00:05:01.195 } 00:05:01.195 ], 00:05:01.195 "driver_specific": { 00:05:01.195 "passthru": { 00:05:01.195 "name": "Passthru0", 00:05:01.195 "base_bdev_name": "Malloc0" 00:05:01.195 } 00:05:01.195 } 00:05:01.195 } 00:05:01.195 ]' 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:01.195 04:55:21 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:01.195 00:05:01.195 real 0m0.225s 00:05:01.195 user 0m0.125s 00:05:01.195 sys 0m0.038s 00:05:01.195 ************************************ 00:05:01.195 END TEST rpc_integrity 00:05:01.195 ************************************ 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:01.195 04:55:21 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.456 04:55:21 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:01.456 04:55:21 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:01.456 04:55:21 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:01.456 04:55:21 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:01.456 ************************************ 00:05:01.456 START TEST rpc_plugins 00:05:01.456 ************************************ 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@1129 -- # rpc_plugins 00:05:01.456 04:55:21 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.456 04:55:21 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:01.456 04:55:21 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.456 04:55:21 
rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:01.456 { 00:05:01.456 "name": "Malloc1", 00:05:01.456 "aliases": [ 00:05:01.456 "5e1d7ef3-fc01-46ce-9b9e-231ce2b8e93f" 00:05:01.456 ], 00:05:01.456 "product_name": "Malloc disk", 00:05:01.456 "block_size": 4096, 00:05:01.456 "num_blocks": 256, 00:05:01.456 "uuid": "5e1d7ef3-fc01-46ce-9b9e-231ce2b8e93f", 00:05:01.456 "assigned_rate_limits": { 00:05:01.456 "rw_ios_per_sec": 0, 00:05:01.456 "rw_mbytes_per_sec": 0, 00:05:01.456 "r_mbytes_per_sec": 0, 00:05:01.456 "w_mbytes_per_sec": 0 00:05:01.456 }, 00:05:01.456 "claimed": false, 00:05:01.456 "zoned": false, 00:05:01.456 "supported_io_types": { 00:05:01.456 "read": true, 00:05:01.456 "write": true, 00:05:01.456 "unmap": true, 00:05:01.456 "flush": true, 00:05:01.456 "reset": true, 00:05:01.456 "nvme_admin": false, 00:05:01.456 "nvme_io": false, 00:05:01.456 "nvme_io_md": false, 00:05:01.456 "write_zeroes": true, 00:05:01.456 "zcopy": true, 00:05:01.456 "get_zone_info": false, 00:05:01.456 "zone_management": false, 00:05:01.456 "zone_append": false, 00:05:01.456 "compare": false, 00:05:01.456 "compare_and_write": false, 00:05:01.456 "abort": true, 00:05:01.456 "seek_hole": false, 00:05:01.456 "seek_data": false, 00:05:01.456 "copy": true, 00:05:01.456 "nvme_iov_md": false 00:05:01.456 }, 00:05:01.456 "memory_domains": [ 00:05:01.456 { 00:05:01.456 "dma_device_id": "system", 00:05:01.456 "dma_device_type": 1 00:05:01.456 }, 00:05:01.456 { 00:05:01.456 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:01.456 "dma_device_type": 2 00:05:01.456 } 00:05:01.456 ], 00:05:01.456 "driver_specific": {} 00:05:01.456 } 00:05:01.456 ]' 00:05:01.456 04:55:21 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:01.456 04:55:21 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:01.456 04:55:21 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.456 04:55:21 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:01.456 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.456 04:55:21 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:01.456 04:55:21 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:01.456 04:55:21 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:01.456 00:05:01.456 real 0m0.119s 00:05:01.456 user 0m0.066s 00:05:01.457 sys 0m0.020s 00:05:01.457 ************************************ 00:05:01.457 END TEST rpc_plugins 00:05:01.457 ************************************ 00:05:01.457 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:01.457 04:55:21 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:01.457 04:55:21 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:01.457 04:55:21 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:01.457 04:55:21 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:01.457 04:55:21 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:01.457 ************************************ 00:05:01.457 START TEST rpc_trace_cmd_test 
00:05:01.457 ************************************ 00:05:01.457 04:55:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1129 -- # rpc_trace_cmd_test 00:05:01.457 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:01.457 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:01.457 04:55:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.457 04:55:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:01.457 04:55:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.457 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:01.457 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid71206", 00:05:01.457 "tpoint_group_mask": "0x8", 00:05:01.457 "iscsi_conn": { 00:05:01.457 "mask": "0x2", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "scsi": { 00:05:01.457 "mask": "0x4", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "bdev": { 00:05:01.457 "mask": "0x8", 00:05:01.457 "tpoint_mask": "0xffffffffffffffff" 00:05:01.457 }, 00:05:01.457 "nvmf_rdma": { 00:05:01.457 "mask": "0x10", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "nvmf_tcp": { 00:05:01.457 "mask": "0x20", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "ftl": { 00:05:01.457 "mask": "0x40", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "blobfs": { 00:05:01.457 "mask": "0x80", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "dsa": { 00:05:01.457 "mask": "0x200", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "thread": { 00:05:01.457 "mask": "0x400", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "nvme_pcie": { 00:05:01.457 "mask": "0x800", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "iaa": { 00:05:01.457 "mask": "0x1000", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "nvme_tcp": { 00:05:01.457 "mask": "0x2000", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "bdev_nvme": { 00:05:01.457 "mask": "0x4000", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "sock": { 00:05:01.457 "mask": "0x8000", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "blob": { 00:05:01.457 "mask": "0x10000", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "bdev_raid": { 00:05:01.457 "mask": "0x20000", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 }, 00:05:01.457 "scheduler": { 00:05:01.457 "mask": "0x40000", 00:05:01.457 "tpoint_mask": "0x0" 00:05:01.457 } 00:05:01.457 }' 00:05:01.457 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:01.457 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 19 -gt 2 ']' 00:05:01.457 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:01.717 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:01.717 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:05:01.717 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:01.717 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:01.717 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:01.717 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:01.717 04:55:21 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:01.717 00:05:01.717 real 0m0.168s 00:05:01.717 
user 0m0.132s 00:05:01.717 sys 0m0.023s 00:05:01.717 ************************************ 00:05:01.717 END TEST rpc_trace_cmd_test 00:05:01.717 ************************************ 00:05:01.717 04:55:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:01.717 04:55:21 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:01.717 04:55:21 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:01.717 04:55:21 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:01.717 04:55:21 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:01.717 04:55:21 rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:01.717 04:55:21 rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:01.717 04:55:21 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:01.717 ************************************ 00:05:01.717 START TEST rpc_daemon_integrity 00:05:01.717 ************************************ 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1129 -- # rpc_integrity 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.717 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:01.717 { 00:05:01.717 "name": "Malloc2", 00:05:01.717 "aliases": [ 00:05:01.717 "3172b947-0aa5-4fcd-a1b0-e87da8489f23" 00:05:01.717 ], 00:05:01.717 "product_name": "Malloc disk", 00:05:01.717 "block_size": 512, 00:05:01.717 "num_blocks": 16384, 00:05:01.717 "uuid": "3172b947-0aa5-4fcd-a1b0-e87da8489f23", 00:05:01.717 "assigned_rate_limits": { 00:05:01.717 "rw_ios_per_sec": 0, 00:05:01.717 "rw_mbytes_per_sec": 0, 00:05:01.717 "r_mbytes_per_sec": 0, 00:05:01.717 "w_mbytes_per_sec": 0 00:05:01.717 }, 00:05:01.717 "claimed": false, 00:05:01.717 "zoned": false, 00:05:01.717 "supported_io_types": { 00:05:01.717 "read": true, 00:05:01.717 "write": true, 00:05:01.717 "unmap": true, 00:05:01.717 "flush": true, 00:05:01.717 "reset": true, 00:05:01.717 "nvme_admin": false, 00:05:01.717 "nvme_io": false, 00:05:01.717 "nvme_io_md": false, 00:05:01.717 "write_zeroes": true, 00:05:01.717 "zcopy": true, 00:05:01.717 "get_zone_info": 
false, 00:05:01.717 "zone_management": false, 00:05:01.717 "zone_append": false, 00:05:01.717 "compare": false, 00:05:01.717 "compare_and_write": false, 00:05:01.717 "abort": true, 00:05:01.718 "seek_hole": false, 00:05:01.718 "seek_data": false, 00:05:01.718 "copy": true, 00:05:01.718 "nvme_iov_md": false 00:05:01.718 }, 00:05:01.718 "memory_domains": [ 00:05:01.718 { 00:05:01.718 "dma_device_id": "system", 00:05:01.718 "dma_device_type": 1 00:05:01.718 }, 00:05:01.718 { 00:05:01.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:01.718 "dma_device_type": 2 00:05:01.718 } 00:05:01.718 ], 00:05:01.718 "driver_specific": {} 00:05:01.718 } 00:05:01.718 ]' 00:05:01.718 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:01.979 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:01.979 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:01.979 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.979 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.979 [2024-12-15 04:55:21.880778] vbdev_passthru.c: 608:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:01.979 [2024-12-15 04:55:21.880853] vbdev_passthru.c: 636:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:01.979 [2024-12-15 04:55:21.880887] vbdev_passthru.c: 682:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x616000009680 00:05:01.979 [2024-12-15 04:55:21.880898] vbdev_passthru.c: 697:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:01.979 [2024-12-15 04:55:21.883390] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:01.979 [2024-12-15 04:55:21.883461] vbdev_passthru.c: 711:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:01.979 Passthru0 00:05:01.979 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.979 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:01.979 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.979 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.979 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.979 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:01.979 { 00:05:01.979 "name": "Malloc2", 00:05:01.979 "aliases": [ 00:05:01.979 "3172b947-0aa5-4fcd-a1b0-e87da8489f23" 00:05:01.980 ], 00:05:01.980 "product_name": "Malloc disk", 00:05:01.980 "block_size": 512, 00:05:01.980 "num_blocks": 16384, 00:05:01.980 "uuid": "3172b947-0aa5-4fcd-a1b0-e87da8489f23", 00:05:01.980 "assigned_rate_limits": { 00:05:01.980 "rw_ios_per_sec": 0, 00:05:01.980 "rw_mbytes_per_sec": 0, 00:05:01.980 "r_mbytes_per_sec": 0, 00:05:01.980 "w_mbytes_per_sec": 0 00:05:01.980 }, 00:05:01.980 "claimed": true, 00:05:01.980 "claim_type": "exclusive_write", 00:05:01.980 "zoned": false, 00:05:01.980 "supported_io_types": { 00:05:01.980 "read": true, 00:05:01.980 "write": true, 00:05:01.980 "unmap": true, 00:05:01.980 "flush": true, 00:05:01.980 "reset": true, 00:05:01.980 "nvme_admin": false, 00:05:01.980 "nvme_io": false, 00:05:01.980 "nvme_io_md": false, 00:05:01.980 "write_zeroes": true, 00:05:01.980 "zcopy": true, 00:05:01.980 "get_zone_info": false, 00:05:01.980 "zone_management": false, 00:05:01.980 "zone_append": false, 00:05:01.980 "compare": false, 
00:05:01.980 "compare_and_write": false, 00:05:01.980 "abort": true, 00:05:01.980 "seek_hole": false, 00:05:01.980 "seek_data": false, 00:05:01.980 "copy": true, 00:05:01.980 "nvme_iov_md": false 00:05:01.980 }, 00:05:01.980 "memory_domains": [ 00:05:01.980 { 00:05:01.980 "dma_device_id": "system", 00:05:01.980 "dma_device_type": 1 00:05:01.980 }, 00:05:01.980 { 00:05:01.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:01.980 "dma_device_type": 2 00:05:01.980 } 00:05:01.980 ], 00:05:01.980 "driver_specific": {} 00:05:01.980 }, 00:05:01.980 { 00:05:01.980 "name": "Passthru0", 00:05:01.980 "aliases": [ 00:05:01.980 "f6412f8d-df62-5d7a-852d-35d4f7fc92c0" 00:05:01.980 ], 00:05:01.980 "product_name": "passthru", 00:05:01.980 "block_size": 512, 00:05:01.980 "num_blocks": 16384, 00:05:01.980 "uuid": "f6412f8d-df62-5d7a-852d-35d4f7fc92c0", 00:05:01.980 "assigned_rate_limits": { 00:05:01.980 "rw_ios_per_sec": 0, 00:05:01.980 "rw_mbytes_per_sec": 0, 00:05:01.980 "r_mbytes_per_sec": 0, 00:05:01.980 "w_mbytes_per_sec": 0 00:05:01.980 }, 00:05:01.980 "claimed": false, 00:05:01.980 "zoned": false, 00:05:01.980 "supported_io_types": { 00:05:01.980 "read": true, 00:05:01.980 "write": true, 00:05:01.980 "unmap": true, 00:05:01.980 "flush": true, 00:05:01.980 "reset": true, 00:05:01.980 "nvme_admin": false, 00:05:01.980 "nvme_io": false, 00:05:01.980 "nvme_io_md": false, 00:05:01.980 "write_zeroes": true, 00:05:01.980 "zcopy": true, 00:05:01.980 "get_zone_info": false, 00:05:01.980 "zone_management": false, 00:05:01.980 "zone_append": false, 00:05:01.980 "compare": false, 00:05:01.980 "compare_and_write": false, 00:05:01.980 "abort": true, 00:05:01.980 "seek_hole": false, 00:05:01.980 "seek_data": false, 00:05:01.980 "copy": true, 00:05:01.980 "nvme_iov_md": false 00:05:01.980 }, 00:05:01.980 "memory_domains": [ 00:05:01.980 { 00:05:01.980 "dma_device_id": "system", 00:05:01.980 "dma_device_type": 1 00:05:01.980 }, 00:05:01.980 { 00:05:01.980 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:01.980 "dma_device_type": 2 00:05:01.980 } 00:05:01.980 ], 00:05:01.980 "driver_specific": { 00:05:01.980 "passthru": { 00:05:01.980 "name": "Passthru0", 00:05:01.980 "base_bdev_name": "Malloc2" 00:05:01.980 } 00:05:01.980 } 00:05:01.980 } 00:05:01.980 ]' 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- 
common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:01.980 00:05:01.980 real 0m0.220s 00:05:01.980 user 0m0.118s 00:05:01.980 sys 0m0.040s 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:01.980 ************************************ 00:05:01.980 END TEST rpc_daemon_integrity 00:05:01.980 ************************************ 00:05:01.980 04:55:21 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:01.980 04:55:22 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:01.980 04:55:22 rpc -- rpc/rpc.sh@84 -- # killprocess 71206 00:05:01.980 04:55:22 rpc -- common/autotest_common.sh@954 -- # '[' -z 71206 ']' 00:05:01.980 04:55:22 rpc -- common/autotest_common.sh@958 -- # kill -0 71206 00:05:01.980 04:55:22 rpc -- common/autotest_common.sh@959 -- # uname 00:05:01.980 04:55:22 rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:01.980 04:55:22 rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 71206 00:05:01.980 04:55:22 rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:01.980 04:55:22 rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:01.980 killing process with pid 71206 00:05:01.980 04:55:22 rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 71206' 00:05:01.980 04:55:22 rpc -- common/autotest_common.sh@973 -- # kill 71206 00:05:01.980 04:55:22 rpc -- common/autotest_common.sh@978 -- # wait 71206 00:05:02.554 00:05:02.554 real 0m2.428s 00:05:02.554 user 0m2.801s 00:05:02.554 sys 0m0.701s 00:05:02.554 04:55:22 rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:02.554 04:55:22 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:02.554 ************************************ 00:05:02.554 END TEST rpc 00:05:02.554 ************************************ 00:05:02.554 04:55:22 -- spdk/autotest.sh@157 -- # run_test skip_rpc /home/vagrant/spdk_repo/spdk/test/rpc/skip_rpc.sh 00:05:02.554 04:55:22 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:02.554 04:55:22 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:02.554 04:55:22 -- common/autotest_common.sh@10 -- # set +x 00:05:02.555 ************************************ 00:05:02.555 START TEST skip_rpc 00:05:02.555 ************************************ 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/rpc/skip_rpc.sh 00:05:02.555 * Looking for test storage... 
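Note: the rpc_daemon_integrity pass that just ended can be reproduced by hand against a running spdk_tgt with scripts/rpc.py. A minimal sketch, where the bdev sizes and expected jq counts come from the log above and the rest is an assumption:

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    malloc=$("$rpc" bdev_malloc_create 8 512)    # prints the new bdev name, e.g. Malloc2
    "$rpc" bdev_passthru_create -b "$malloc" -p Passthru0
    "$rpc" bdev_get_bdevs | jq length            # 2: the base bdev plus the passthru that claims it
    "$rpc" bdev_passthru_delete Passthru0
    "$rpc" bdev_malloc_delete "$malloc"
    "$rpc" bdev_get_bdevs | jq length            # back to 0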
00:05:02.555 * Found test storage at /home/vagrant/spdk_repo/spdk/test/rpc 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1711 -- # lcov --version 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@344 -- # case "$op" in 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@345 -- # : 1 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@365 -- # decimal 1 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@353 -- # local d=1 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@355 -- # echo 1 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@366 -- # decimal 2 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@353 -- # local d=2 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@355 -- # echo 2 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:02.555 04:55:22 skip_rpc -- scripts/common.sh@368 -- # return 0 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:02.555 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:02.555 --rc genhtml_branch_coverage=1 00:05:02.555 --rc genhtml_function_coverage=1 00:05:02.555 --rc genhtml_legend=1 00:05:02.555 --rc geninfo_all_blocks=1 00:05:02.555 --rc geninfo_unexecuted_blocks=1 00:05:02.555 00:05:02.555 ' 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:02.555 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:02.555 --rc genhtml_branch_coverage=1 00:05:02.555 --rc genhtml_function_coverage=1 00:05:02.555 --rc genhtml_legend=1 00:05:02.555 --rc geninfo_all_blocks=1 00:05:02.555 --rc geninfo_unexecuted_blocks=1 00:05:02.555 00:05:02.555 ' 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 
00:05:02.555 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:02.555 --rc genhtml_branch_coverage=1 00:05:02.555 --rc genhtml_function_coverage=1 00:05:02.555 --rc genhtml_legend=1 00:05:02.555 --rc geninfo_all_blocks=1 00:05:02.555 --rc geninfo_unexecuted_blocks=1 00:05:02.555 00:05:02.555 ' 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:02.555 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:02.555 --rc genhtml_branch_coverage=1 00:05:02.555 --rc genhtml_function_coverage=1 00:05:02.555 --rc genhtml_legend=1 00:05:02.555 --rc geninfo_all_blocks=1 00:05:02.555 --rc geninfo_unexecuted_blocks=1 00:05:02.555 00:05:02.555 ' 00:05:02.555 04:55:22 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/home/vagrant/spdk_repo/spdk/test/rpc/config.json 00:05:02.555 04:55:22 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/home/vagrant/spdk_repo/spdk/test/rpc/log.txt 00:05:02.555 04:55:22 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:02.555 04:55:22 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:02.555 ************************************ 00:05:02.555 START TEST skip_rpc 00:05:02.555 ************************************ 00:05:02.555 04:55:22 skip_rpc.skip_rpc -- common/autotest_common.sh@1129 -- # test_skip_rpc 00:05:02.555 04:55:22 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=71408 00:05:02.555 04:55:22 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:02.555 04:55:22 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:02.555 04:55:22 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:02.816 [2024-12-15 04:55:22.707705] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
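Note: the skip_rpc case starting here boots the target with --no-rpc-server and then asserts that any RPC attempt must fail. Roughly, with repo-relative paths assumed and the real test's trap/killprocess plumbing omitted:

    build/bin/spdk_tgt --no-rpc-server -m 0x1 &
    spdk_pid=$!
    sleep 5                                      # the test sleeps before probing, per skip_rpc.sh@19
    if scripts/rpc.py spdk_get_version >/dev/null 2>&1; then
        echo "unexpected: RPC answered despite --no-rpc-server" >&2
        exit 1
    fi
    kill "$spdk_pid"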
00:05:02.816 [2024-12-15 04:55:22.707863] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid71408 ] 00:05:02.816 [2024-12-15 04:55:22.870787] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:02.816 [2024-12-15 04:55:22.898967] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # local es=0 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # rpc_cmd spdk_get_version 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@655 -- # es=1 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 71408 00:05:08.106 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' -z 71408 ']' 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # kill -0 71408 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # uname 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 71408 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:08.107 killing process with pid 71408 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 71408' 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # kill 71408 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@978 -- # wait 71408 00:05:08.107 00:05:08.107 real 0m5.262s 00:05:08.107 user 0m4.849s 00:05:08.107 sys 0m0.309s 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:08.107 ************************************ 00:05:08.107 END TEST skip_rpc 00:05:08.107 ************************************ 00:05:08.107 04:55:27 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # 
set +x 00:05:08.107 04:55:27 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:08.107 04:55:27 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:08.107 04:55:27 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:08.107 04:55:27 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:08.107 ************************************ 00:05:08.107 START TEST skip_rpc_with_json 00:05:08.107 ************************************ 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_json 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=71495 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 71495 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # '[' -z 71495 ']' 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:08.107 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:08.107 04:55:27 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:08.107 [2024-12-15 04:55:28.016314] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:05:08.107 [2024-12-15 04:55:28.016454] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid71495 ] 00:05:08.107 [2024-12-15 04:55:28.173277] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:08.107 [2024-12-15 04:55:28.194865] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@868 -- # return 0 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:09.050 [2024-12-15 04:55:28.861958] nvmf_rpc.c:2707:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:09.050 request: 00:05:09.050 { 00:05:09.050 "trtype": "tcp", 00:05:09.050 "method": "nvmf_get_transports", 00:05:09.050 "req_id": 1 00:05:09.050 } 00:05:09.050 Got JSON-RPC error response 00:05:09.050 response: 00:05:09.050 { 00:05:09.050 "code": -19, 00:05:09.050 "message": "No such device" 00:05:09.050 } 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:09.050 [2024-12-15 04:55:28.870045] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:09.050 04:55:28 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:09.050 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:09.050 04:55:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /home/vagrant/spdk_repo/spdk/test/rpc/config.json 00:05:09.050 { 00:05:09.050 "subsystems": [ 00:05:09.050 { 00:05:09.050 "subsystem": "fsdev", 00:05:09.050 "config": [ 00:05:09.050 { 00:05:09.050 "method": "fsdev_set_opts", 00:05:09.050 "params": { 00:05:09.050 "fsdev_io_pool_size": 65535, 00:05:09.050 "fsdev_io_cache_size": 256 00:05:09.050 } 00:05:09.050 } 00:05:09.050 ] 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "subsystem": "keyring", 00:05:09.050 "config": [] 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "subsystem": "iobuf", 00:05:09.050 "config": [ 00:05:09.050 { 00:05:09.050 "method": "iobuf_set_options", 00:05:09.050 "params": { 00:05:09.050 "small_pool_count": 8192, 00:05:09.050 "large_pool_count": 1024, 00:05:09.050 "small_bufsize": 8192, 00:05:09.050 "large_bufsize": 135168, 00:05:09.050 "enable_numa": false 00:05:09.050 } 00:05:09.050 } 00:05:09.050 ] 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "subsystem": "sock", 00:05:09.050 "config": [ 00:05:09.050 { 
00:05:09.050 "method": "sock_set_default_impl", 00:05:09.050 "params": { 00:05:09.050 "impl_name": "posix" 00:05:09.050 } 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "method": "sock_impl_set_options", 00:05:09.050 "params": { 00:05:09.050 "impl_name": "ssl", 00:05:09.050 "recv_buf_size": 4096, 00:05:09.050 "send_buf_size": 4096, 00:05:09.050 "enable_recv_pipe": true, 00:05:09.050 "enable_quickack": false, 00:05:09.050 "enable_placement_id": 0, 00:05:09.050 "enable_zerocopy_send_server": true, 00:05:09.050 "enable_zerocopy_send_client": false, 00:05:09.050 "zerocopy_threshold": 0, 00:05:09.050 "tls_version": 0, 00:05:09.050 "enable_ktls": false 00:05:09.050 } 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "method": "sock_impl_set_options", 00:05:09.050 "params": { 00:05:09.050 "impl_name": "posix", 00:05:09.050 "recv_buf_size": 2097152, 00:05:09.050 "send_buf_size": 2097152, 00:05:09.050 "enable_recv_pipe": true, 00:05:09.050 "enable_quickack": false, 00:05:09.050 "enable_placement_id": 0, 00:05:09.050 "enable_zerocopy_send_server": true, 00:05:09.050 "enable_zerocopy_send_client": false, 00:05:09.050 "zerocopy_threshold": 0, 00:05:09.050 "tls_version": 0, 00:05:09.050 "enable_ktls": false 00:05:09.050 } 00:05:09.050 } 00:05:09.050 ] 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "subsystem": "vmd", 00:05:09.050 "config": [] 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "subsystem": "accel", 00:05:09.050 "config": [ 00:05:09.050 { 00:05:09.050 "method": "accel_set_options", 00:05:09.050 "params": { 00:05:09.050 "small_cache_size": 128, 00:05:09.050 "large_cache_size": 16, 00:05:09.050 "task_count": 2048, 00:05:09.050 "sequence_count": 2048, 00:05:09.050 "buf_count": 2048 00:05:09.050 } 00:05:09.050 } 00:05:09.050 ] 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "subsystem": "bdev", 00:05:09.050 "config": [ 00:05:09.050 { 00:05:09.050 "method": "bdev_set_options", 00:05:09.050 "params": { 00:05:09.050 "bdev_io_pool_size": 65535, 00:05:09.050 "bdev_io_cache_size": 256, 00:05:09.050 "bdev_auto_examine": true, 00:05:09.050 "iobuf_small_cache_size": 128, 00:05:09.050 "iobuf_large_cache_size": 16 00:05:09.050 } 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "method": "bdev_raid_set_options", 00:05:09.050 "params": { 00:05:09.050 "process_window_size_kb": 1024, 00:05:09.050 "process_max_bandwidth_mb_sec": 0 00:05:09.050 } 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "method": "bdev_iscsi_set_options", 00:05:09.050 "params": { 00:05:09.050 "timeout_sec": 30 00:05:09.050 } 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "method": "bdev_nvme_set_options", 00:05:09.050 "params": { 00:05:09.050 "action_on_timeout": "none", 00:05:09.050 "timeout_us": 0, 00:05:09.050 "timeout_admin_us": 0, 00:05:09.050 "keep_alive_timeout_ms": 10000, 00:05:09.050 "arbitration_burst": 0, 00:05:09.050 "low_priority_weight": 0, 00:05:09.050 "medium_priority_weight": 0, 00:05:09.050 "high_priority_weight": 0, 00:05:09.050 "nvme_adminq_poll_period_us": 10000, 00:05:09.050 "nvme_ioq_poll_period_us": 0, 00:05:09.050 "io_queue_requests": 0, 00:05:09.050 "delay_cmd_submit": true, 00:05:09.050 "transport_retry_count": 4, 00:05:09.050 "bdev_retry_count": 3, 00:05:09.050 "transport_ack_timeout": 0, 00:05:09.050 "ctrlr_loss_timeout_sec": 0, 00:05:09.050 "reconnect_delay_sec": 0, 00:05:09.050 "fast_io_fail_timeout_sec": 0, 00:05:09.050 "disable_auto_failback": false, 00:05:09.050 "generate_uuids": false, 00:05:09.050 "transport_tos": 0, 00:05:09.050 "nvme_error_stat": false, 00:05:09.050 "rdma_srq_size": 0, 00:05:09.050 "io_path_stat": false, 
00:05:09.050 "allow_accel_sequence": false, 00:05:09.050 "rdma_max_cq_size": 0, 00:05:09.050 "rdma_cm_event_timeout_ms": 0, 00:05:09.050 "dhchap_digests": [ 00:05:09.050 "sha256", 00:05:09.050 "sha384", 00:05:09.050 "sha512" 00:05:09.050 ], 00:05:09.050 "dhchap_dhgroups": [ 00:05:09.050 "null", 00:05:09.050 "ffdhe2048", 00:05:09.050 "ffdhe3072", 00:05:09.050 "ffdhe4096", 00:05:09.050 "ffdhe6144", 00:05:09.050 "ffdhe8192" 00:05:09.050 ], 00:05:09.050 "rdma_umr_per_io": false 00:05:09.050 } 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "method": "bdev_nvme_set_hotplug", 00:05:09.050 "params": { 00:05:09.050 "period_us": 100000, 00:05:09.050 "enable": false 00:05:09.050 } 00:05:09.050 }, 00:05:09.050 { 00:05:09.050 "method": "bdev_wait_for_examine" 00:05:09.050 } 00:05:09.050 ] 00:05:09.050 }, 00:05:09.050 { 00:05:09.051 "subsystem": "scsi", 00:05:09.051 "config": null 00:05:09.051 }, 00:05:09.051 { 00:05:09.051 "subsystem": "scheduler", 00:05:09.051 "config": [ 00:05:09.051 { 00:05:09.051 "method": "framework_set_scheduler", 00:05:09.051 "params": { 00:05:09.051 "name": "static" 00:05:09.051 } 00:05:09.051 } 00:05:09.051 ] 00:05:09.051 }, 00:05:09.051 { 00:05:09.051 "subsystem": "vhost_scsi", 00:05:09.051 "config": [] 00:05:09.051 }, 00:05:09.051 { 00:05:09.051 "subsystem": "vhost_blk", 00:05:09.051 "config": [] 00:05:09.051 }, 00:05:09.051 { 00:05:09.051 "subsystem": "ublk", 00:05:09.051 "config": [] 00:05:09.051 }, 00:05:09.051 { 00:05:09.051 "subsystem": "nbd", 00:05:09.051 "config": [] 00:05:09.051 }, 00:05:09.051 { 00:05:09.051 "subsystem": "nvmf", 00:05:09.051 "config": [ 00:05:09.051 { 00:05:09.051 "method": "nvmf_set_config", 00:05:09.051 "params": { 00:05:09.051 "discovery_filter": "match_any", 00:05:09.051 "admin_cmd_passthru": { 00:05:09.051 "identify_ctrlr": false 00:05:09.051 }, 00:05:09.051 "dhchap_digests": [ 00:05:09.051 "sha256", 00:05:09.051 "sha384", 00:05:09.051 "sha512" 00:05:09.051 ], 00:05:09.051 "dhchap_dhgroups": [ 00:05:09.051 "null", 00:05:09.051 "ffdhe2048", 00:05:09.051 "ffdhe3072", 00:05:09.051 "ffdhe4096", 00:05:09.051 "ffdhe6144", 00:05:09.051 "ffdhe8192" 00:05:09.051 ] 00:05:09.051 } 00:05:09.051 }, 00:05:09.051 { 00:05:09.051 "method": "nvmf_set_max_subsystems", 00:05:09.051 "params": { 00:05:09.051 "max_subsystems": 1024 00:05:09.051 } 00:05:09.051 }, 00:05:09.051 { 00:05:09.051 "method": "nvmf_set_crdt", 00:05:09.051 "params": { 00:05:09.051 "crdt1": 0, 00:05:09.051 "crdt2": 0, 00:05:09.051 "crdt3": 0 00:05:09.051 } 00:05:09.051 }, 00:05:09.051 { 00:05:09.051 "method": "nvmf_create_transport", 00:05:09.051 "params": { 00:05:09.051 "trtype": "TCP", 00:05:09.051 "max_queue_depth": 128, 00:05:09.051 "max_io_qpairs_per_ctrlr": 127, 00:05:09.051 "in_capsule_data_size": 4096, 00:05:09.051 "max_io_size": 131072, 00:05:09.051 "io_unit_size": 131072, 00:05:09.051 "max_aq_depth": 128, 00:05:09.051 "num_shared_buffers": 511, 00:05:09.051 "buf_cache_size": 4294967295, 00:05:09.051 "dif_insert_or_strip": false, 00:05:09.051 "zcopy": false, 00:05:09.051 "c2h_success": true, 00:05:09.051 "sock_priority": 0, 00:05:09.051 "abort_timeout_sec": 1, 00:05:09.051 "ack_timeout": 0, 00:05:09.051 "data_wr_pool_size": 0 00:05:09.051 } 00:05:09.051 } 00:05:09.051 ] 00:05:09.051 }, 00:05:09.051 { 00:05:09.051 "subsystem": "iscsi", 00:05:09.051 "config": [ 00:05:09.051 { 00:05:09.051 "method": "iscsi_set_options", 00:05:09.051 "params": { 00:05:09.051 "node_base": "iqn.2016-06.io.spdk", 00:05:09.051 "max_sessions": 128, 00:05:09.051 "max_connections_per_session": 2, 00:05:09.051 
"max_queue_depth": 64, 00:05:09.051 "default_time2wait": 2, 00:05:09.051 "default_time2retain": 20, 00:05:09.051 "first_burst_length": 8192, 00:05:09.051 "immediate_data": true, 00:05:09.051 "allow_duplicated_isid": false, 00:05:09.051 "error_recovery_level": 0, 00:05:09.051 "nop_timeout": 60, 00:05:09.051 "nop_in_interval": 30, 00:05:09.051 "disable_chap": false, 00:05:09.051 "require_chap": false, 00:05:09.051 "mutual_chap": false, 00:05:09.051 "chap_group": 0, 00:05:09.051 "max_large_datain_per_connection": 64, 00:05:09.051 "max_r2t_per_connection": 4, 00:05:09.051 "pdu_pool_size": 36864, 00:05:09.051 "immediate_data_pool_size": 16384, 00:05:09.051 "data_out_pool_size": 2048 00:05:09.051 } 00:05:09.051 } 00:05:09.051 ] 00:05:09.051 } 00:05:09.051 ] 00:05:09.051 } 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 71495 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 71495 ']' 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 71495 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 71495 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:09.051 killing process with pid 71495 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 71495' 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # kill 71495 00:05:09.051 04:55:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 71495 00:05:09.311 04:55:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=71518 00:05:09.312 04:55:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:09.312 04:55:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /home/vagrant/spdk_repo/spdk/test/rpc/config.json 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 71518 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' -z 71518 ']' 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # kill -0 71518 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # uname 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 71518 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:14.602 killing process with pid 71518 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # echo 'killing process with pid 71518' 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json 
-- common/autotest_common.sh@973 -- # kill 71518 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@978 -- # wait 71518 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /home/vagrant/spdk_repo/spdk/test/rpc/log.txt 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm /home/vagrant/spdk_repo/spdk/test/rpc/log.txt 00:05:14.602 00:05:14.602 real 0m6.591s 00:05:14.602 user 0m6.316s 00:05:14.602 sys 0m0.513s 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:14.602 ************************************ 00:05:14.602 END TEST skip_rpc_with_json 00:05:14.602 ************************************ 00:05:14.602 04:55:34 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:05:14.602 04:55:34 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:14.602 04:55:34 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:14.602 04:55:34 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:14.602 ************************************ 00:05:14.602 START TEST skip_rpc_with_delay 00:05:14.602 ************************************ 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1129 -- # test_skip_rpc_with_delay 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # local es=0 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@654 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@646 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt ]] 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:14.602 [2024-12-15 04:55:34.640149] app.c: 842:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
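Note: skip_rpc_with_delay only checks that the flag combination logged above is rejected at startup; the assertion amounts to the following sketch (flags verbatim from the log; nothing beyond a non-zero exit is assumed):

    if build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc; then
        echo "unexpected: --wait-for-rpc accepted with no RPC server" >&2
        exit 1
    fi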
00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@655 -- # es=1 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:05:14.602 00:05:14.602 real 0m0.113s 00:05:14.602 user 0m0.065s 00:05:14.602 sys 0m0.047s 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:14.602 04:55:34 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:05:14.602 ************************************ 00:05:14.602 END TEST skip_rpc_with_delay 00:05:14.602 ************************************ 00:05:14.602 04:55:34 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:05:14.602 04:55:34 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:05:14.602 04:55:34 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:05:14.602 04:55:34 skip_rpc -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:14.602 04:55:34 skip_rpc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:14.602 04:55:34 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:14.602 ************************************ 00:05:14.602 START TEST exit_on_failed_rpc_init 00:05:14.602 ************************************ 00:05:14.602 04:55:34 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1129 -- # test_exit_on_failed_rpc_init 00:05:14.602 04:55:34 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=71630 00:05:14.602 04:55:34 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 71630 00:05:14.602 04:55:34 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # '[' -z 71630 ']' 00:05:14.602 04:55:34 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:14.602 04:55:34 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:14.602 04:55:34 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:14.602 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:14.602 04:55:34 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:14.602 04:55:34 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:05:14.602 04:55:34 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:14.863 [2024-12-15 04:55:34.798429] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
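Note: the "Waiting for process to start up and listen on UNIX domain socket..." message below comes from a poll-until-ready helper. A rough sketch of that pattern, not the exact autotest_common.sh implementation:

    waitforlisten() {
        local pid=$1 sock=${2:-/var/tmp/spdk.sock} i
        for ((i = 0; i < 100; i++)); do
            kill -0 "$pid" 2>/dev/null || return 1   # target died while starting
            scripts/rpc.py -s "$sock" rpc_get_methods >/dev/null 2>&1 && return 0
            sleep 0.1
        done
        return 1                                     # never came up
    }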
00:05:14.863 [2024-12-15 04:55:34.798813] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid71630 ] 00:05:14.863 [2024-12-15 04:55:34.952984] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:14.863 [2024-12-15 04:55:34.969478] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@868 -- # return 0 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x2 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # local es=0 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@654 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x2 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # local arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -t /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # type -P /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # arg=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@646 -- # [[ -x /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt ]] 00:05:15.807 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x2 00:05:15.807 [2024-12-15 04:55:35.702472] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:05:15.807 [2024-12-15 04:55:35.702577] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid71642 ] 00:05:15.807 [2024-12-15 04:55:35.860740] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:15.807 [2024-12-15 04:55:35.878565] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:05:15.807 [2024-12-15 04:55:35.878636] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
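Note: exit_on_failed_rpc_init provokes the "socket in use" error that follows by racing two targets onto the default RPC socket. Schematically (a sketch; core masks taken from the log, the sleep is a crude stand-in for the real listener wait):

    build/bin/spdk_tgt -m 0x1 &                  # first instance binds /var/tmp/spdk.sock
    first_pid=$!
    sleep 1                                      # the real test polls until the socket is live
    build/bin/spdk_tgt -m 0x2                    # second instance: RPC init fails, exits non-zero
    echo "second target exited with $?"
    kill "$first_pid"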
00:05:15.807 [2024-12-15 04:55:35.878652] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:15.807 [2024-12-15 04:55:35.878661] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@655 -- # es=234 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@664 -- # es=106 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@665 -- # case "$es" in 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@672 -- # es=1 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 71630 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' -z 71630 ']' 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # kill -0 71630 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # uname 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 71630 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:16.068 killing process with pid 71630 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # echo 'killing process with pid 71630' 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # kill 71630 00:05:16.068 04:55:35 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@978 -- # wait 71630 00:05:16.068 00:05:16.068 real 0m1.462s 00:05:16.068 user 0m1.636s 00:05:16.068 sys 0m0.342s 00:05:16.068 04:55:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:16.068 04:55:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:16.068 ************************************ 00:05:16.068 END TEST exit_on_failed_rpc_init 00:05:16.068 ************************************ 00:05:16.330 04:55:36 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /home/vagrant/spdk_repo/spdk/test/rpc/config.json 00:05:16.330 00:05:16.330 real 0m13.775s 00:05:16.330 user 0m13.041s 00:05:16.330 sys 0m1.366s 00:05:16.330 04:55:36 skip_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:16.330 04:55:36 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:16.330 ************************************ 00:05:16.330 END TEST skip_rpc 00:05:16.330 ************************************ 00:05:16.330 04:55:36 -- spdk/autotest.sh@158 -- # run_test rpc_client /home/vagrant/spdk_repo/spdk/test/rpc_client/rpc_client.sh 00:05:16.330 04:55:36 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:16.330 04:55:36 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:16.330 04:55:36 -- common/autotest_common.sh@10 -- # set +x 00:05:16.330 
************************************ 00:05:16.330 START TEST rpc_client 00:05:16.330 ************************************ 00:05:16.330 04:55:36 rpc_client -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/rpc_client/rpc_client.sh 00:05:16.330 * Looking for test storage... 00:05:16.330 * Found test storage at /home/vagrant/spdk_repo/spdk/test/rpc_client 00:05:16.330 04:55:36 rpc_client -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:16.330 04:55:36 rpc_client -- common/autotest_common.sh@1711 -- # lcov --version 00:05:16.330 04:55:36 rpc_client -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:16.330 04:55:36 rpc_client -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@336 -- # IFS=.-: 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@336 -- # read -ra ver1 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@337 -- # IFS=.-: 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@337 -- # read -ra ver2 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@338 -- # local 'op=<' 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@340 -- # ver1_l=2 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@341 -- # ver2_l=1 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@344 -- # case "$op" in 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@345 -- # : 1 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@365 -- # decimal 1 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@353 -- # local d=1 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@355 -- # echo 1 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@365 -- # ver1[v]=1 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@366 -- # decimal 2 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@353 -- # local d=2 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@355 -- # echo 2 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@366 -- # ver2[v]=2 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:16.330 04:55:36 rpc_client -- scripts/common.sh@368 -- # return 0 00:05:16.330 04:55:36 rpc_client -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:16.330 04:55:36 rpc_client -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:16.330 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.330 --rc genhtml_branch_coverage=1 00:05:16.330 --rc genhtml_function_coverage=1 00:05:16.330 --rc genhtml_legend=1 00:05:16.330 --rc geninfo_all_blocks=1 00:05:16.330 --rc geninfo_unexecuted_blocks=1 00:05:16.330 00:05:16.330 ' 00:05:16.330 04:55:36 rpc_client -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:16.330 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.330 --rc genhtml_branch_coverage=1 00:05:16.330 --rc genhtml_function_coverage=1 00:05:16.330 --rc genhtml_legend=1 00:05:16.330 --rc geninfo_all_blocks=1 00:05:16.330 --rc geninfo_unexecuted_blocks=1 00:05:16.330 00:05:16.330 ' 00:05:16.330 04:55:36 rpc_client -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:16.330 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.330 --rc genhtml_branch_coverage=1 00:05:16.330 --rc genhtml_function_coverage=1 00:05:16.330 --rc genhtml_legend=1 00:05:16.330 --rc geninfo_all_blocks=1 00:05:16.330 --rc geninfo_unexecuted_blocks=1 00:05:16.330 00:05:16.330 ' 00:05:16.330 04:55:36 rpc_client -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:16.331 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.331 --rc genhtml_branch_coverage=1 00:05:16.331 --rc genhtml_function_coverage=1 00:05:16.331 --rc genhtml_legend=1 00:05:16.331 --rc geninfo_all_blocks=1 00:05:16.331 --rc geninfo_unexecuted_blocks=1 00:05:16.331 00:05:16.331 ' 00:05:16.331 04:55:36 rpc_client -- rpc_client/rpc_client.sh@10 -- # /home/vagrant/spdk_repo/spdk/test/rpc_client/rpc_client_test 00:05:16.331 OK 00:05:16.331 04:55:36 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:16.331 00:05:16.331 real 0m0.171s 00:05:16.331 user 0m0.105s 00:05:16.331 sys 0m0.076s 00:05:16.331 04:55:36 rpc_client -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:16.331 04:55:36 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:05:16.331 ************************************ 00:05:16.331 END TEST rpc_client 00:05:16.331 ************************************ 00:05:16.331 04:55:36 -- spdk/autotest.sh@159 -- # run_test json_config /home/vagrant/spdk_repo/spdk/test/json_config/json_config.sh 00:05:16.331 04:55:36 -- 
common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:16.331 04:55:36 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:16.331 04:55:36 -- common/autotest_common.sh@10 -- # set +x 00:05:16.331 ************************************ 00:05:16.331 START TEST json_config 00:05:16.331 ************************************ 00:05:16.331 04:55:36 json_config -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/json_config/json_config.sh 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@1711 -- # lcov --version 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:16.593 04:55:36 json_config -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:16.593 04:55:36 json_config -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:16.593 04:55:36 json_config -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:16.593 04:55:36 json_config -- scripts/common.sh@336 -- # IFS=.-: 00:05:16.593 04:55:36 json_config -- scripts/common.sh@336 -- # read -ra ver1 00:05:16.593 04:55:36 json_config -- scripts/common.sh@337 -- # IFS=.-: 00:05:16.593 04:55:36 json_config -- scripts/common.sh@337 -- # read -ra ver2 00:05:16.593 04:55:36 json_config -- scripts/common.sh@338 -- # local 'op=<' 00:05:16.593 04:55:36 json_config -- scripts/common.sh@340 -- # ver1_l=2 00:05:16.593 04:55:36 json_config -- scripts/common.sh@341 -- # ver2_l=1 00:05:16.593 04:55:36 json_config -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:16.593 04:55:36 json_config -- scripts/common.sh@344 -- # case "$op" in 00:05:16.593 04:55:36 json_config -- scripts/common.sh@345 -- # : 1 00:05:16.593 04:55:36 json_config -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:16.593 04:55:36 json_config -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:16.593 04:55:36 json_config -- scripts/common.sh@365 -- # decimal 1 00:05:16.593 04:55:36 json_config -- scripts/common.sh@353 -- # local d=1 00:05:16.593 04:55:36 json_config -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:16.593 04:55:36 json_config -- scripts/common.sh@355 -- # echo 1 00:05:16.593 04:55:36 json_config -- scripts/common.sh@365 -- # ver1[v]=1 00:05:16.593 04:55:36 json_config -- scripts/common.sh@366 -- # decimal 2 00:05:16.593 04:55:36 json_config -- scripts/common.sh@353 -- # local d=2 00:05:16.593 04:55:36 json_config -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:16.593 04:55:36 json_config -- scripts/common.sh@355 -- # echo 2 00:05:16.593 04:55:36 json_config -- scripts/common.sh@366 -- # ver2[v]=2 00:05:16.593 04:55:36 json_config -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:16.593 04:55:36 json_config -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:16.593 04:55:36 json_config -- scripts/common.sh@368 -- # return 0 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:16.593 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.593 --rc genhtml_branch_coverage=1 00:05:16.593 --rc genhtml_function_coverage=1 00:05:16.593 --rc genhtml_legend=1 00:05:16.593 --rc geninfo_all_blocks=1 00:05:16.593 --rc geninfo_unexecuted_blocks=1 00:05:16.593 00:05:16.593 ' 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:16.593 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.593 --rc genhtml_branch_coverage=1 00:05:16.593 --rc genhtml_function_coverage=1 00:05:16.593 --rc genhtml_legend=1 00:05:16.593 --rc geninfo_all_blocks=1 00:05:16.593 --rc geninfo_unexecuted_blocks=1 00:05:16.593 00:05:16.593 ' 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:16.593 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.593 --rc genhtml_branch_coverage=1 00:05:16.593 --rc genhtml_function_coverage=1 00:05:16.593 --rc genhtml_legend=1 00:05:16.593 --rc geninfo_all_blocks=1 00:05:16.593 --rc geninfo_unexecuted_blocks=1 00:05:16.593 00:05:16.593 ' 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:16.593 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.593 --rc genhtml_branch_coverage=1 00:05:16.593 --rc genhtml_function_coverage=1 00:05:16.593 --rc genhtml_legend=1 00:05:16.593 --rc geninfo_all_blocks=1 00:05:16.593 --rc geninfo_unexecuted_blocks=1 00:05:16.593 00:05:16.593 ' 00:05:16.593 04:55:36 json_config -- json_config/json_config.sh@8 -- # source /home/vagrant/spdk_repo/spdk/test/nvmf/common.sh 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@7 -- # uname -s 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:16.593 04:55:36 
json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:9174b0f4-4dff-4414-95f3-547baa722471 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=9174b0f4-4dff-4414-95f3-547baa722471 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@49 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:05:16.593 04:55:36 json_config -- scripts/common.sh@15 -- # shopt -s extglob 00:05:16.593 04:55:36 json_config -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:16.593 04:55:36 json_config -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:16.593 04:55:36 json_config -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:16.593 04:55:36 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:16.593 04:55:36 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:16.593 04:55:36 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:16.593 04:55:36 json_config -- paths/export.sh@5 -- # export PATH 00:05:16.593 04:55:36 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@51 -- # : 0 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:05:16.593 04:55:36 json_config -- 
nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:05:16.593 /home/vagrant/spdk_repo/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:05:16.593 04:55:36 json_config -- nvmf/common.sh@55 -- # have_pci_nics=0 00:05:16.593 04:55:36 json_config -- json_config/json_config.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/json_config/common.sh 00:05:16.593 04:55:36 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:16.593 04:55:36 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:16.593 04:55:36 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:16.593 04:55:36 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:16.593 04:55:36 json_config -- json_config/json_config.sh@27 -- # echo 'WARNING: No tests are enabled so not running JSON configuration tests' 00:05:16.593 WARNING: No tests are enabled so not running JSON configuration tests 00:05:16.593 04:55:36 json_config -- json_config/json_config.sh@28 -- # exit 0 00:05:16.593 00:05:16.593 real 0m0.135s 00:05:16.593 user 0m0.086s 00:05:16.593 sys 0m0.051s 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:16.593 04:55:36 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:16.593 ************************************ 00:05:16.593 END TEST json_config 00:05:16.593 ************************************ 00:05:16.593 04:55:36 -- spdk/autotest.sh@160 -- # run_test json_config_extra_key /home/vagrant/spdk_repo/spdk/test/json_config/json_config_extra_key.sh 00:05:16.593 04:55:36 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:16.593 04:55:36 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:16.593 04:55:36 -- common/autotest_common.sh@10 -- # set +x 00:05:16.593 ************************************ 00:05:16.593 START TEST json_config_extra_key 00:05:16.593 ************************************ 00:05:16.593 04:55:36 json_config_extra_key -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/json_config/json_config_extra_key.sh 00:05:16.593 04:55:36 json_config_extra_key -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:16.593 04:55:36 json_config_extra_key -- common/autotest_common.sh@1711 -- # lcov --version 00:05:16.593 04:55:36 json_config_extra_key -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:16.593 04:55:36 json_config_extra_key -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:16.593 04:55:36 json_config_extra_key -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:16.593 04:55:36 json_config_extra_key -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:16.593 04:55:36 json_config_extra_key -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:16.593 04:55:36 json_config_extra_key -- scripts/common.sh@336 -- # IFS=.-: 00:05:16.593 04:55:36 json_config_extra_key -- scripts/common.sh@336 -- # read -ra ver1 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@337 -- # IFS=.-: 00:05:16.594 04:55:36 
json_config_extra_key -- scripts/common.sh@337 -- # read -ra ver2 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@338 -- # local 'op=<' 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@340 -- # ver1_l=2 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@341 -- # ver2_l=1 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@344 -- # case "$op" in 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@345 -- # : 1 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@365 -- # decimal 1 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@353 -- # local d=1 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@355 -- # echo 1 00:05:16.594 04:55:36 json_config_extra_key -- scripts/common.sh@365 -- # ver1[v]=1 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@366 -- # decimal 2 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@353 -- # local d=2 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@355 -- # echo 2 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@366 -- # ver2[v]=2 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@368 -- # return 0 00:05:16.855 04:55:36 json_config_extra_key -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:16.855 04:55:36 json_config_extra_key -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:16.855 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.855 --rc genhtml_branch_coverage=1 00:05:16.855 --rc genhtml_function_coverage=1 00:05:16.855 --rc genhtml_legend=1 00:05:16.855 --rc geninfo_all_blocks=1 00:05:16.855 --rc geninfo_unexecuted_blocks=1 00:05:16.855 00:05:16.855 ' 00:05:16.855 04:55:36 json_config_extra_key -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:16.855 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.855 --rc genhtml_branch_coverage=1 00:05:16.855 --rc genhtml_function_coverage=1 00:05:16.855 --rc genhtml_legend=1 00:05:16.855 --rc geninfo_all_blocks=1 00:05:16.855 --rc geninfo_unexecuted_blocks=1 00:05:16.855 00:05:16.855 ' 00:05:16.855 04:55:36 json_config_extra_key -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:16.855 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.855 --rc genhtml_branch_coverage=1 00:05:16.855 --rc genhtml_function_coverage=1 00:05:16.855 --rc genhtml_legend=1 00:05:16.855 --rc geninfo_all_blocks=1 00:05:16.855 --rc geninfo_unexecuted_blocks=1 00:05:16.855 00:05:16.855 ' 00:05:16.855 04:55:36 json_config_extra_key -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:16.855 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:16.855 --rc genhtml_branch_coverage=1 00:05:16.855 --rc 
genhtml_function_coverage=1 00:05:16.855 --rc genhtml_legend=1 00:05:16.855 --rc geninfo_all_blocks=1 00:05:16.855 --rc geninfo_unexecuted_blocks=1 00:05:16.855 00:05:16.855 ' 00:05:16.855 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/nvmf/common.sh 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:9174b0f4-4dff-4414-95f3-547baa722471 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=9174b0f4-4dff-4414-95f3-547baa722471 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:16.855 04:55:36 json_config_extra_key -- nvmf/common.sh@49 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@15 -- # shopt -s extglob 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:16.855 04:55:36 json_config_extra_key -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:16.856 04:55:36 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:16.856 04:55:36 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:16.856 04:55:36 json_config_extra_key -- paths/export.sh@4 
-- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:16.856 04:55:36 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:05:16.856 04:55:36 json_config_extra_key -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:16.856 04:55:36 json_config_extra_key -- nvmf/common.sh@51 -- # : 0 00:05:16.856 04:55:36 json_config_extra_key -- nvmf/common.sh@52 -- # export NVMF_APP_SHM_ID 00:05:16.856 04:55:36 json_config_extra_key -- nvmf/common.sh@53 -- # build_nvmf_app_args 00:05:16.856 04:55:36 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:16.856 04:55:36 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:16.856 04:55:36 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:16.856 04:55:36 json_config_extra_key -- nvmf/common.sh@33 -- # '[' '' -eq 1 ']' 00:05:16.856 /home/vagrant/spdk_repo/spdk/test/nvmf/common.sh: line 33: [: : integer expression expected 00:05:16.856 04:55:36 json_config_extra_key -- nvmf/common.sh@37 -- # '[' -n '' ']' 00:05:16.856 04:55:36 json_config_extra_key -- nvmf/common.sh@39 -- # '[' 0 -eq 1 ']' 00:05:16.856 04:55:36 json_config_extra_key -- nvmf/common.sh@55 -- # have_pci_nics=0 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /home/vagrant/spdk_repo/spdk/test/json_config/common.sh 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 1024') 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/home/vagrant/spdk_repo/spdk/test/json_config/extra_key.json') 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:16.856 INFO: launching applications... 00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 
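The "[: : integer expression expected" stderr above (it also appeared in the earlier json_config run, once per sourcing of nvmf/common.sh) comes from line 33, traced as '[' '' -eq 1 ']': the test builtin's -eq demands integer operands, and whatever variable line 33 expands is empty here (the trace does not show its name). A minimal reproduction and the usual defensive fix, as a sketch:

    # reproduce: -eq requires integers, so an empty operand is an error (exit 2)
    VAR=""
    [ "$VAR" -eq 1 ] && echo yes    # -> "[: : integer expression expected"

    # common fix: default the expansion to 0 so the comparison stays numeric
    [ "${VAR:-0}" -eq 1 ] && echo yes    # simply false, no stderr noise

The run continues past the failed test in both suites here, so in these traces the message is noise rather than a failure.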
00:05:16.856 04:55:36 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /home/vagrant/spdk_repo/spdk/test/json_config/extra_key.json 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=71825 00:05:16.856 Waiting for target to run... 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 71825 /var/tmp/spdk_tgt.sock 00:05:16.856 04:55:36 json_config_extra_key -- common/autotest_common.sh@835 -- # '[' -z 71825 ']' 00:05:16.856 04:55:36 json_config_extra_key -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:16.856 04:55:36 json_config_extra_key -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:16.856 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:16.856 04:55:36 json_config_extra_key -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:16.856 04:55:36 json_config_extra_key -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:16.856 04:55:36 json_config_extra_key -- json_config/common.sh@21 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /home/vagrant/spdk_repo/spdk/test/json_config/extra_key.json 00:05:16.856 04:55:36 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:05:16.856 [2024-12-15 04:55:36.825301] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:05:16.856 [2024-12-15 04:55:36.825418] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid71825 ] 00:05:17.117 [2024-12-15 04:55:37.135027] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:17.117 [2024-12-15 04:55:37.145767] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:17.689 04:55:37 json_config_extra_key -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:17.689 04:55:37 json_config_extra_key -- common/autotest_common.sh@868 -- # return 0 00:05:17.689 00:05:17.689 04:55:37 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:05:17.689 INFO: shutting down applications... 00:05:17.689 04:55:37 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 
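The target (pid 71825) is now up, and the shutdown path that follows is the instructive part: json_config/common.sh sends SIGINT and then polls the pid until it disappears. Condensed from the xtrace below into a standalone sketch, with the same bound and helpers as the trace:

    app_pid=71825                       # recorded by common.sh@22 above
    kill -SIGINT "$app_pid"             # common.sh@38: ask spdk_tgt to exit cleanly
    for (( i = 0; i < 30; i++ )); do    # common.sh@40: at most 30 probes
        kill -0 "$app_pid" 2>/dev/null || break    # kill -0 only checks liveness
        sleep 0.5                       # common.sh@45: half a second between probes
    done

In the trace below the second kill -0 already fails, so the loop clears app_pid and breaks after a single sleep: roughly half a second of shutdown latency.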
00:05:17.689 04:55:37 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:05:17.689 04:55:37 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:05:17.689 04:55:37 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:17.689 04:55:37 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 71825 ]] 00:05:17.689 04:55:37 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 71825 00:05:17.689 04:55:37 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:17.689 04:55:37 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:17.689 04:55:37 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 71825 00:05:17.689 04:55:37 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:05:18.309 04:55:38 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:05:18.309 04:55:38 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:18.309 04:55:38 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 71825 00:05:18.309 04:55:38 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:18.309 04:55:38 json_config_extra_key -- json_config/common.sh@43 -- # break 00:05:18.309 SPDK target shutdown done 00:05:18.309 04:55:38 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:18.309 04:55:38 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:18.309 Success 00:05:18.309 04:55:38 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:05:18.309 00:05:18.309 real 0m1.528s 00:05:18.309 user 0m1.213s 00:05:18.309 sys 0m0.347s 00:05:18.309 04:55:38 json_config_extra_key -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:18.309 04:55:38 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:05:18.309 ************************************ 00:05:18.309 END TEST json_config_extra_key 00:05:18.309 ************************************ 00:05:18.309 04:55:38 -- spdk/autotest.sh@161 -- # run_test alias_rpc /home/vagrant/spdk_repo/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:18.309 04:55:38 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:18.309 04:55:38 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:18.309 04:55:38 -- common/autotest_common.sh@10 -- # set +x 00:05:18.309 ************************************ 00:05:18.309 START TEST alias_rpc 00:05:18.309 ************************************ 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:05:18.309 * Looking for test storage... 
00:05:18.309 * Found test storage at /home/vagrant/spdk_repo/spdk/test/json_config/alias_rpc 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@1711 -- # lcov --version 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@344 -- # case "$op" in 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@345 -- # : 1 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@365 -- # decimal 1 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@353 -- # local d=1 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@355 -- # echo 1 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@366 -- # decimal 2 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@353 -- # local d=2 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@355 -- # echo 2 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:18.309 04:55:38 alias_rpc -- scripts/common.sh@368 -- # return 0 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:18.309 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.309 --rc genhtml_branch_coverage=1 00:05:18.309 --rc genhtml_function_coverage=1 00:05:18.309 --rc genhtml_legend=1 00:05:18.309 --rc geninfo_all_blocks=1 00:05:18.309 --rc geninfo_unexecuted_blocks=1 00:05:18.309 00:05:18.309 ' 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:18.309 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.309 --rc genhtml_branch_coverage=1 00:05:18.309 --rc genhtml_function_coverage=1 00:05:18.309 --rc genhtml_legend=1 00:05:18.309 --rc geninfo_all_blocks=1 00:05:18.309 --rc geninfo_unexecuted_blocks=1 00:05:18.309 00:05:18.309 ' 00:05:18.309 04:55:38 alias_rpc -- 
common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:18.309 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.309 --rc genhtml_branch_coverage=1 00:05:18.309 --rc genhtml_function_coverage=1 00:05:18.309 --rc genhtml_legend=1 00:05:18.309 --rc geninfo_all_blocks=1 00:05:18.309 --rc geninfo_unexecuted_blocks=1 00:05:18.309 00:05:18.309 ' 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:18.309 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:18.309 --rc genhtml_branch_coverage=1 00:05:18.309 --rc genhtml_function_coverage=1 00:05:18.309 --rc genhtml_legend=1 00:05:18.309 --rc geninfo_all_blocks=1 00:05:18.309 --rc geninfo_unexecuted_blocks=1 00:05:18.309 00:05:18.309 ' 00:05:18.309 04:55:38 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:05:18.309 04:55:38 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=71898 00:05:18.309 04:55:38 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 71898 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@835 -- # '[' -z 71898 ']' 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:18.309 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:18.309 04:55:38 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:18.309 04:55:38 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:05:18.309 [2024-12-15 04:55:38.416507] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
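alias_rpc.sh@10 above installs an ERR trap so that any failing command in the test tears down the spdk_tgt it started before the script exits. The shape of the idiom, reconstructed as a sketch: the backgrounding and $! capture are assumptions (the trace only shows the resulting pid 71898), and killprocess is the harness helper whose kill/wait internals appear later in this log:

    /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt &   # start the target
    spdk_tgt_pid=$!                                     # 71898 in this run
    trap 'killprocess $spdk_tgt_pid; exit 1' ERR        # failure path cleans up first
    waitforlisten "$spdk_tgt_pid"                       # block until the RPC socket answers
    # ... test body: rpc.py load_config -i, etc. ...
    killprocess "$spdk_tgt_pid"                         # normal teardown (alias_rpc.sh@19)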
00:05:18.309 [2024-12-15 04:55:38.416620] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid71898 ] 00:05:18.574 [2024-12-15 04:55:38.566240] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:18.574 [2024-12-15 04:55:38.584114] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:19.144 04:55:39 alias_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:19.144 04:55:39 alias_rpc -- common/autotest_common.sh@868 -- # return 0 00:05:19.144 04:55:39 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py load_config -i 00:05:19.404 04:55:39 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 71898 00:05:19.404 04:55:39 alias_rpc -- common/autotest_common.sh@954 -- # '[' -z 71898 ']' 00:05:19.404 04:55:39 alias_rpc -- common/autotest_common.sh@958 -- # kill -0 71898 00:05:19.404 04:55:39 alias_rpc -- common/autotest_common.sh@959 -- # uname 00:05:19.404 04:55:39 alias_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:19.404 04:55:39 alias_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 71898 00:05:19.404 04:55:39 alias_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:19.404 killing process with pid 71898 00:05:19.404 04:55:39 alias_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:19.404 04:55:39 alias_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 71898' 00:05:19.404 04:55:39 alias_rpc -- common/autotest_common.sh@973 -- # kill 71898 00:05:19.404 04:55:39 alias_rpc -- common/autotest_common.sh@978 -- # wait 71898 00:05:19.664 00:05:19.664 real 0m1.542s 00:05:19.664 user 0m1.686s 00:05:19.664 sys 0m0.361s 00:05:19.664 ************************************ 00:05:19.664 END TEST alias_rpc 00:05:19.664 ************************************ 00:05:19.664 04:55:39 alias_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:19.664 04:55:39 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:19.664 04:55:39 -- spdk/autotest.sh@163 -- # [[ 0 -eq 0 ]] 00:05:19.664 04:55:39 -- spdk/autotest.sh@164 -- # run_test spdkcli_tcp /home/vagrant/spdk_repo/spdk/test/spdkcli/tcp.sh 00:05:19.664 04:55:39 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:19.664 04:55:39 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:19.664 04:55:39 -- common/autotest_common.sh@10 -- # set +x 00:05:19.664 ************************************ 00:05:19.664 START TEST spdkcli_tcp 00:05:19.664 ************************************ 00:05:19.664 04:55:39 spdkcli_tcp -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/spdkcli/tcp.sh 00:05:19.926 * Looking for test storage... 
00:05:19.926 * Found test storage at /home/vagrant/spdk_repo/spdk/test/spdkcli 00:05:19.926 04:55:39 spdkcli_tcp -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:19.926 04:55:39 spdkcli_tcp -- common/autotest_common.sh@1711 -- # lcov --version 00:05:19.926 04:55:39 spdkcli_tcp -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:19.926 04:55:39 spdkcli_tcp -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@336 -- # IFS=.-: 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@336 -- # read -ra ver1 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@337 -- # IFS=.-: 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@337 -- # read -ra ver2 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@338 -- # local 'op=<' 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@340 -- # ver1_l=2 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@341 -- # ver2_l=1 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@344 -- # case "$op" in 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@345 -- # : 1 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@365 -- # decimal 1 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@353 -- # local d=1 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@355 -- # echo 1 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@365 -- # ver1[v]=1 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@366 -- # decimal 2 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@353 -- # local d=2 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@355 -- # echo 2 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@366 -- # ver2[v]=2 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:19.926 04:55:39 spdkcli_tcp -- scripts/common.sh@368 -- # return 0 00:05:19.926 04:55:39 spdkcli_tcp -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:19.926 04:55:39 spdkcli_tcp -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:19.926 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.926 --rc genhtml_branch_coverage=1 00:05:19.926 --rc genhtml_function_coverage=1 00:05:19.926 --rc genhtml_legend=1 00:05:19.926 --rc geninfo_all_blocks=1 00:05:19.926 --rc geninfo_unexecuted_blocks=1 00:05:19.926 00:05:19.926 ' 00:05:19.926 04:55:39 spdkcli_tcp -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:19.926 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.926 --rc genhtml_branch_coverage=1 00:05:19.926 --rc genhtml_function_coverage=1 00:05:19.926 --rc genhtml_legend=1 00:05:19.927 --rc geninfo_all_blocks=1 00:05:19.927 --rc geninfo_unexecuted_blocks=1 00:05:19.927 
00:05:19.927 ' 00:05:19.927 04:55:39 spdkcli_tcp -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:19.927 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.927 --rc genhtml_branch_coverage=1 00:05:19.927 --rc genhtml_function_coverage=1 00:05:19.927 --rc genhtml_legend=1 00:05:19.927 --rc geninfo_all_blocks=1 00:05:19.927 --rc geninfo_unexecuted_blocks=1 00:05:19.927 00:05:19.927 ' 00:05:19.927 04:55:39 spdkcli_tcp -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:19.927 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:19.927 --rc genhtml_branch_coverage=1 00:05:19.927 --rc genhtml_function_coverage=1 00:05:19.927 --rc genhtml_legend=1 00:05:19.927 --rc geninfo_all_blocks=1 00:05:19.927 --rc geninfo_unexecuted_blocks=1 00:05:19.927 00:05:19.927 ' 00:05:19.927 04:55:39 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/spdkcli/common.sh 00:05:19.927 04:55:39 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/home/vagrant/spdk_repo/spdk/test/spdkcli/spdkcli_job.py 00:05:19.927 04:55:39 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/home/vagrant/spdk_repo/spdk/test/json_config/clear_config.py 00:05:19.927 04:55:39 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:05:19.927 04:55:39 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:05:19.927 04:55:39 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:05:19.927 04:55:39 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:05:19.927 04:55:39 spdkcli_tcp -- common/autotest_common.sh@726 -- # xtrace_disable 00:05:19.927 04:55:39 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:19.927 04:55:39 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=71978 00:05:19.927 04:55:39 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 71978 00:05:19.927 04:55:39 spdkcli_tcp -- common/autotest_common.sh@835 -- # '[' -z 71978 ']' 00:05:19.927 04:55:39 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:05:19.927 04:55:39 spdkcli_tcp -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:19.927 04:55:39 spdkcli_tcp -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:19.927 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:19.927 04:55:39 spdkcli_tcp -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:19.927 04:55:39 spdkcli_tcp -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:19.927 04:55:39 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:19.927 [2024-12-15 04:55:40.017171] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
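The lcov probe at the top of this test has now run identically for every suite in this log: autotest_common.sh checks whether version 1.15 sorts below 2 (lt 1.15 2) to choose the branch/function coverage flags. Reconstructed from the traced steps (IFS split on ".-:", then a per-field numeric comparison), the logic is roughly the following; the real cmp_versions in scripts/common.sh supports more operators and non-numeric fields, which this sketch omits:

    lt() {  # true if version $1 sorts before version $2
        local -a ver1 ver2
        IFS=.-: read -ra ver1 <<< "$1"    # scripts/common.sh@336: "1.15" -> (1 15)
        IFS=.-: read -ra ver2 <<< "$2"    # scripts/common.sh@337
        local v max=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < max; v++ )); do                     # scripts/common.sh@364
            (( ${ver1[v]:-0} < ${ver2[v]:-0} )) && return 0   # scripts/common.sh@368
            (( ${ver1[v]:-0} > ${ver2[v]:-0} )) && return 1   # scripts/common.sh@367
        done
        return 1    # equal versions are not "less than"
    }
    lt 1.15 2 && echo "old lcov"    # first field: 1 < 2, so this returns 0 as traced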
00:05:19.927 [2024-12-15 04:55:40.017283] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid71978 ] 00:05:20.188 [2024-12-15 04:55:40.172342] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:20.188 [2024-12-15 04:55:40.191795] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:20.188 [2024-12-15 04:55:40.191828] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:05:20.760 04:55:40 spdkcli_tcp -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:20.760 04:55:40 spdkcli_tcp -- common/autotest_common.sh@868 -- # return 0 00:05:20.760 04:55:40 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=71995 00:05:20.760 04:55:40 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:05:20.760 04:55:40 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:05:21.023 [ 00:05:21.023 "bdev_malloc_delete", 00:05:21.023 "bdev_malloc_create", 00:05:21.023 "bdev_null_resize", 00:05:21.023 "bdev_null_delete", 00:05:21.023 "bdev_null_create", 00:05:21.023 "bdev_nvme_cuse_unregister", 00:05:21.023 "bdev_nvme_cuse_register", 00:05:21.023 "bdev_opal_new_user", 00:05:21.023 "bdev_opal_set_lock_state", 00:05:21.023 "bdev_opal_delete", 00:05:21.023 "bdev_opal_get_info", 00:05:21.023 "bdev_opal_create", 00:05:21.023 "bdev_nvme_opal_revert", 00:05:21.023 "bdev_nvme_opal_init", 00:05:21.023 "bdev_nvme_send_cmd", 00:05:21.023 "bdev_nvme_set_keys", 00:05:21.023 "bdev_nvme_get_path_iostat", 00:05:21.023 "bdev_nvme_get_mdns_discovery_info", 00:05:21.023 "bdev_nvme_stop_mdns_discovery", 00:05:21.023 "bdev_nvme_start_mdns_discovery", 00:05:21.023 "bdev_nvme_set_multipath_policy", 00:05:21.023 "bdev_nvme_set_preferred_path", 00:05:21.023 "bdev_nvme_get_io_paths", 00:05:21.023 "bdev_nvme_remove_error_injection", 00:05:21.023 "bdev_nvme_add_error_injection", 00:05:21.023 "bdev_nvme_get_discovery_info", 00:05:21.023 "bdev_nvme_stop_discovery", 00:05:21.023 "bdev_nvme_start_discovery", 00:05:21.023 "bdev_nvme_get_controller_health_info", 00:05:21.023 "bdev_nvme_disable_controller", 00:05:21.023 "bdev_nvme_enable_controller", 00:05:21.023 "bdev_nvme_reset_controller", 00:05:21.023 "bdev_nvme_get_transport_statistics", 00:05:21.023 "bdev_nvme_apply_firmware", 00:05:21.023 "bdev_nvme_detach_controller", 00:05:21.023 "bdev_nvme_get_controllers", 00:05:21.023 "bdev_nvme_attach_controller", 00:05:21.023 "bdev_nvme_set_hotplug", 00:05:21.023 "bdev_nvme_set_options", 00:05:21.023 "bdev_passthru_delete", 00:05:21.023 "bdev_passthru_create", 00:05:21.023 "bdev_lvol_set_parent_bdev", 00:05:21.023 "bdev_lvol_set_parent", 00:05:21.023 "bdev_lvol_check_shallow_copy", 00:05:21.023 "bdev_lvol_start_shallow_copy", 00:05:21.023 "bdev_lvol_grow_lvstore", 00:05:21.023 "bdev_lvol_get_lvols", 00:05:21.023 "bdev_lvol_get_lvstores", 00:05:21.023 "bdev_lvol_delete", 00:05:21.023 "bdev_lvol_set_read_only", 00:05:21.023 "bdev_lvol_resize", 00:05:21.023 "bdev_lvol_decouple_parent", 00:05:21.023 "bdev_lvol_inflate", 00:05:21.023 "bdev_lvol_rename", 00:05:21.023 "bdev_lvol_clone_bdev", 00:05:21.023 "bdev_lvol_clone", 00:05:21.023 "bdev_lvol_snapshot", 00:05:21.023 "bdev_lvol_create", 00:05:21.023 "bdev_lvol_delete_lvstore", 00:05:21.023 "bdev_lvol_rename_lvstore", 00:05:21.023 
"bdev_lvol_create_lvstore", 00:05:21.023 "bdev_raid_set_options", 00:05:21.023 "bdev_raid_remove_base_bdev", 00:05:21.023 "bdev_raid_add_base_bdev", 00:05:21.023 "bdev_raid_delete", 00:05:21.023 "bdev_raid_create", 00:05:21.023 "bdev_raid_get_bdevs", 00:05:21.023 "bdev_error_inject_error", 00:05:21.023 "bdev_error_delete", 00:05:21.023 "bdev_error_create", 00:05:21.023 "bdev_split_delete", 00:05:21.023 "bdev_split_create", 00:05:21.023 "bdev_delay_delete", 00:05:21.023 "bdev_delay_create", 00:05:21.023 "bdev_delay_update_latency", 00:05:21.023 "bdev_zone_block_delete", 00:05:21.023 "bdev_zone_block_create", 00:05:21.023 "blobfs_create", 00:05:21.023 "blobfs_detect", 00:05:21.023 "blobfs_set_cache_size", 00:05:21.023 "bdev_xnvme_delete", 00:05:21.023 "bdev_xnvme_create", 00:05:21.023 "bdev_aio_delete", 00:05:21.023 "bdev_aio_rescan", 00:05:21.023 "bdev_aio_create", 00:05:21.023 "bdev_ftl_set_property", 00:05:21.023 "bdev_ftl_get_properties", 00:05:21.023 "bdev_ftl_get_stats", 00:05:21.023 "bdev_ftl_unmap", 00:05:21.023 "bdev_ftl_unload", 00:05:21.023 "bdev_ftl_delete", 00:05:21.023 "bdev_ftl_load", 00:05:21.023 "bdev_ftl_create", 00:05:21.023 "bdev_virtio_attach_controller", 00:05:21.023 "bdev_virtio_scsi_get_devices", 00:05:21.023 "bdev_virtio_detach_controller", 00:05:21.023 "bdev_virtio_blk_set_hotplug", 00:05:21.023 "bdev_iscsi_delete", 00:05:21.023 "bdev_iscsi_create", 00:05:21.023 "bdev_iscsi_set_options", 00:05:21.023 "accel_error_inject_error", 00:05:21.023 "ioat_scan_accel_module", 00:05:21.023 "dsa_scan_accel_module", 00:05:21.023 "iaa_scan_accel_module", 00:05:21.023 "keyring_file_remove_key", 00:05:21.023 "keyring_file_add_key", 00:05:21.023 "keyring_linux_set_options", 00:05:21.023 "fsdev_aio_delete", 00:05:21.023 "fsdev_aio_create", 00:05:21.023 "iscsi_get_histogram", 00:05:21.023 "iscsi_enable_histogram", 00:05:21.023 "iscsi_set_options", 00:05:21.023 "iscsi_get_auth_groups", 00:05:21.023 "iscsi_auth_group_remove_secret", 00:05:21.023 "iscsi_auth_group_add_secret", 00:05:21.023 "iscsi_delete_auth_group", 00:05:21.023 "iscsi_create_auth_group", 00:05:21.023 "iscsi_set_discovery_auth", 00:05:21.023 "iscsi_get_options", 00:05:21.023 "iscsi_target_node_request_logout", 00:05:21.023 "iscsi_target_node_set_redirect", 00:05:21.023 "iscsi_target_node_set_auth", 00:05:21.023 "iscsi_target_node_add_lun", 00:05:21.023 "iscsi_get_stats", 00:05:21.023 "iscsi_get_connections", 00:05:21.023 "iscsi_portal_group_set_auth", 00:05:21.023 "iscsi_start_portal_group", 00:05:21.023 "iscsi_delete_portal_group", 00:05:21.023 "iscsi_create_portal_group", 00:05:21.023 "iscsi_get_portal_groups", 00:05:21.023 "iscsi_delete_target_node", 00:05:21.023 "iscsi_target_node_remove_pg_ig_maps", 00:05:21.023 "iscsi_target_node_add_pg_ig_maps", 00:05:21.023 "iscsi_create_target_node", 00:05:21.023 "iscsi_get_target_nodes", 00:05:21.023 "iscsi_delete_initiator_group", 00:05:21.024 "iscsi_initiator_group_remove_initiators", 00:05:21.024 "iscsi_initiator_group_add_initiators", 00:05:21.024 "iscsi_create_initiator_group", 00:05:21.024 "iscsi_get_initiator_groups", 00:05:21.024 "nvmf_set_crdt", 00:05:21.024 "nvmf_set_config", 00:05:21.024 "nvmf_set_max_subsystems", 00:05:21.024 "nvmf_stop_mdns_prr", 00:05:21.024 "nvmf_publish_mdns_prr", 00:05:21.024 "nvmf_subsystem_get_listeners", 00:05:21.024 "nvmf_subsystem_get_qpairs", 00:05:21.024 "nvmf_subsystem_get_controllers", 00:05:21.024 "nvmf_get_stats", 00:05:21.024 "nvmf_get_transports", 00:05:21.024 "nvmf_create_transport", 00:05:21.024 "nvmf_get_targets", 00:05:21.024 
"nvmf_delete_target", 00:05:21.024 "nvmf_create_target", 00:05:21.024 "nvmf_subsystem_allow_any_host", 00:05:21.024 "nvmf_subsystem_set_keys", 00:05:21.024 "nvmf_subsystem_remove_host", 00:05:21.024 "nvmf_subsystem_add_host", 00:05:21.024 "nvmf_ns_remove_host", 00:05:21.024 "nvmf_ns_add_host", 00:05:21.024 "nvmf_subsystem_remove_ns", 00:05:21.024 "nvmf_subsystem_set_ns_ana_group", 00:05:21.024 "nvmf_subsystem_add_ns", 00:05:21.024 "nvmf_subsystem_listener_set_ana_state", 00:05:21.024 "nvmf_discovery_get_referrals", 00:05:21.024 "nvmf_discovery_remove_referral", 00:05:21.024 "nvmf_discovery_add_referral", 00:05:21.024 "nvmf_subsystem_remove_listener", 00:05:21.024 "nvmf_subsystem_add_listener", 00:05:21.024 "nvmf_delete_subsystem", 00:05:21.024 "nvmf_create_subsystem", 00:05:21.024 "nvmf_get_subsystems", 00:05:21.024 "env_dpdk_get_mem_stats", 00:05:21.024 "nbd_get_disks", 00:05:21.024 "nbd_stop_disk", 00:05:21.024 "nbd_start_disk", 00:05:21.024 "ublk_recover_disk", 00:05:21.024 "ublk_get_disks", 00:05:21.024 "ublk_stop_disk", 00:05:21.024 "ublk_start_disk", 00:05:21.024 "ublk_destroy_target", 00:05:21.024 "ublk_create_target", 00:05:21.024 "virtio_blk_create_transport", 00:05:21.024 "virtio_blk_get_transports", 00:05:21.024 "vhost_controller_set_coalescing", 00:05:21.024 "vhost_get_controllers", 00:05:21.024 "vhost_delete_controller", 00:05:21.024 "vhost_create_blk_controller", 00:05:21.024 "vhost_scsi_controller_remove_target", 00:05:21.024 "vhost_scsi_controller_add_target", 00:05:21.024 "vhost_start_scsi_controller", 00:05:21.024 "vhost_create_scsi_controller", 00:05:21.024 "thread_set_cpumask", 00:05:21.024 "scheduler_set_options", 00:05:21.024 "framework_get_governor", 00:05:21.024 "framework_get_scheduler", 00:05:21.024 "framework_set_scheduler", 00:05:21.024 "framework_get_reactors", 00:05:21.024 "thread_get_io_channels", 00:05:21.024 "thread_get_pollers", 00:05:21.024 "thread_get_stats", 00:05:21.024 "framework_monitor_context_switch", 00:05:21.024 "spdk_kill_instance", 00:05:21.024 "log_enable_timestamps", 00:05:21.024 "log_get_flags", 00:05:21.024 "log_clear_flag", 00:05:21.024 "log_set_flag", 00:05:21.024 "log_get_level", 00:05:21.024 "log_set_level", 00:05:21.024 "log_get_print_level", 00:05:21.024 "log_set_print_level", 00:05:21.024 "framework_enable_cpumask_locks", 00:05:21.024 "framework_disable_cpumask_locks", 00:05:21.024 "framework_wait_init", 00:05:21.024 "framework_start_init", 00:05:21.024 "scsi_get_devices", 00:05:21.024 "bdev_get_histogram", 00:05:21.024 "bdev_enable_histogram", 00:05:21.024 "bdev_set_qos_limit", 00:05:21.024 "bdev_set_qd_sampling_period", 00:05:21.024 "bdev_get_bdevs", 00:05:21.024 "bdev_reset_iostat", 00:05:21.024 "bdev_get_iostat", 00:05:21.024 "bdev_examine", 00:05:21.024 "bdev_wait_for_examine", 00:05:21.024 "bdev_set_options", 00:05:21.024 "accel_get_stats", 00:05:21.024 "accel_set_options", 00:05:21.024 "accel_set_driver", 00:05:21.024 "accel_crypto_key_destroy", 00:05:21.024 "accel_crypto_keys_get", 00:05:21.024 "accel_crypto_key_create", 00:05:21.024 "accel_assign_opc", 00:05:21.024 "accel_get_module_info", 00:05:21.024 "accel_get_opc_assignments", 00:05:21.024 "vmd_rescan", 00:05:21.024 "vmd_remove_device", 00:05:21.024 "vmd_enable", 00:05:21.024 "sock_get_default_impl", 00:05:21.024 "sock_set_default_impl", 00:05:21.024 "sock_impl_set_options", 00:05:21.024 "sock_impl_get_options", 00:05:21.024 "iobuf_get_stats", 00:05:21.024 "iobuf_set_options", 00:05:21.024 "keyring_get_keys", 00:05:21.024 "framework_get_pci_devices", 00:05:21.024 
"framework_get_config", 00:05:21.024 "framework_get_subsystems", 00:05:21.024 "fsdev_set_opts", 00:05:21.024 "fsdev_get_opts", 00:05:21.024 "trace_get_info", 00:05:21.024 "trace_get_tpoint_group_mask", 00:05:21.024 "trace_disable_tpoint_group", 00:05:21.024 "trace_enable_tpoint_group", 00:05:21.024 "trace_clear_tpoint_mask", 00:05:21.024 "trace_set_tpoint_mask", 00:05:21.024 "notify_get_notifications", 00:05:21.024 "notify_get_types", 00:05:21.024 "spdk_get_version", 00:05:21.024 "rpc_get_methods" 00:05:21.024 ] 00:05:21.024 04:55:41 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@732 -- # xtrace_disable 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:21.024 04:55:41 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:05:21.024 04:55:41 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 71978 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' -z 71978 ']' 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@958 -- # kill -0 71978 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@959 -- # uname 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 71978 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@972 -- # echo 'killing process with pid 71978' 00:05:21.024 killing process with pid 71978 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@973 -- # kill 71978 00:05:21.024 04:55:41 spdkcli_tcp -- common/autotest_common.sh@978 -- # wait 71978 00:05:21.598 00:05:21.598 real 0m1.707s 00:05:21.598 user 0m3.059s 00:05:21.598 sys 0m0.426s 00:05:21.598 04:55:41 spdkcli_tcp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:21.598 ************************************ 00:05:21.598 END TEST spdkcli_tcp 00:05:21.598 ************************************ 00:05:21.598 04:55:41 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:05:21.598 04:55:41 -- spdk/autotest.sh@167 -- # run_test dpdk_mem_utility /home/vagrant/spdk_repo/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:21.598 04:55:41 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:21.598 04:55:41 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:21.598 04:55:41 -- common/autotest_common.sh@10 -- # set +x 00:05:21.598 ************************************ 00:05:21.598 START TEST dpdk_mem_utility 00:05:21.598 ************************************ 00:05:21.598 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:05:21.598 * Looking for test storage... 
00:05:21.598 * Found test storage at /home/vagrant/spdk_repo/spdk/test/dpdk_memory_utility 00:05:21.598 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:21.598 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@1711 -- # lcov --version 00:05:21.598 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:21.598 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:21.598 04:55:41 dpdk_mem_utility -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:21.598 04:55:41 dpdk_mem_utility -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@336 -- # IFS=.-: 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@336 -- # read -ra ver1 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@337 -- # IFS=.-: 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@337 -- # read -ra ver2 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@338 -- # local 'op=<' 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@340 -- # ver1_l=2 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@341 -- # ver2_l=1 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@344 -- # case "$op" in 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@345 -- # : 1 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@365 -- # decimal 1 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=1 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 1 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@365 -- # ver1[v]=1 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@366 -- # decimal 2 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@353 -- # local d=2 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@355 -- # echo 2 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@366 -- # ver2[v]=2 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:21.599 04:55:41 dpdk_mem_utility -- scripts/common.sh@368 -- # return 0 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:21.599 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:21.599 --rc genhtml_branch_coverage=1 00:05:21.599 --rc genhtml_function_coverage=1 00:05:21.599 --rc genhtml_legend=1 00:05:21.599 --rc geninfo_all_blocks=1 00:05:21.599 --rc geninfo_unexecuted_blocks=1 00:05:21.599 00:05:21.599 ' 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:21.599 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:21.599 --rc 
genhtml_branch_coverage=1 00:05:21.599 --rc genhtml_function_coverage=1 00:05:21.599 --rc genhtml_legend=1 00:05:21.599 --rc geninfo_all_blocks=1 00:05:21.599 --rc geninfo_unexecuted_blocks=1 00:05:21.599 00:05:21.599 ' 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:21.599 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:21.599 --rc genhtml_branch_coverage=1 00:05:21.599 --rc genhtml_function_coverage=1 00:05:21.599 --rc genhtml_legend=1 00:05:21.599 --rc geninfo_all_blocks=1 00:05:21.599 --rc geninfo_unexecuted_blocks=1 00:05:21.599 00:05:21.599 ' 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:21.599 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:21.599 --rc genhtml_branch_coverage=1 00:05:21.599 --rc genhtml_function_coverage=1 00:05:21.599 --rc genhtml_legend=1 00:05:21.599 --rc geninfo_all_blocks=1 00:05:21.599 --rc geninfo_unexecuted_blocks=1 00:05:21.599 00:05:21.599 ' 00:05:21.599 04:55:41 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/home/vagrant/spdk_repo/spdk/scripts/dpdk_mem_info.py 00:05:21.599 04:55:41 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=72072 00:05:21.599 04:55:41 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 72072 00:05:21.599 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@835 -- # '[' -z 72072 ']' 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:21.599 04:55:41 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:21.599 04:55:41 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:05:21.861 [2024-12-15 04:55:41.806149] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
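Once this target (pid 72072) is up, test_dpdk_mem_info.sh below drives the whole test with three calls: an RPC that makes the target write a DPDK heap snapshot to a file, then the analyzer script over that file. Shown here as direct commands against the default RPC socket; rpc_cmd in the trace is the harness wrapper around rpc.py:

    /home/vagrant/spdk_repo/spdk/scripts/rpc.py env_dpdk_get_mem_stats
    # -> { "filename": "/tmp/spdk_mem_dump.txt" }  (the dump is written target-side)
    /home/vagrant/spdk_repo/spdk/scripts/dpdk_mem_info.py          # summary (test_dpdk_mem_info.sh@21)
    /home/vagrant/spdk_repo/spdk/scripts/dpdk_mem_info.py -m 0     # per-element view of heap 0 (sh@23)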
00:05:21.861 [2024-12-15 04:55:41.806282] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72072 ]
00:05:21.861 [2024-12-15 04:55:41.964649] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:05:21.861 [2024-12-15 04:55:41.993370] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0
00:05:22.808 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:05:22.808 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@868 -- # return 0
00:05:22.808 04:55:42 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT
00:05:22.808 04:55:42 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats
00:05:22.808 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@563 -- # xtrace_disable
00:05:22.808 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:05:22.808 {
00:05:22.808 "filename": "/tmp/spdk_mem_dump.txt"
00:05:22.808 }
00:05:22.808 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]]
00:05:22.808 04:55:42 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /home/vagrant/spdk_repo/spdk/scripts/dpdk_mem_info.py
00:05:22.808 DPDK memory size 818.000000 MiB in 1 heap(s)
00:05:22.808 1 heaps totaling size 818.000000 MiB
00:05:22.808 size: 818.000000 MiB heap id: 0
00:05:22.808 end heaps----------
00:05:22.808 9 mempools totaling size 603.782043 MiB
00:05:22.808 size: 212.674988 MiB name: PDU_immediate_data_Pool
00:05:22.808 size: 158.602051 MiB name: PDU_data_out_Pool
00:05:22.808 size: 100.555481 MiB name: bdev_io_72072
00:05:22.808 size: 50.003479 MiB name: msgpool_72072
00:05:22.808 size: 36.509338 MiB name: fsdev_io_72072
00:05:22.808 size: 21.763794 MiB name: PDU_Pool
00:05:22.808 size: 19.513306 MiB name: SCSI_TASK_Pool
00:05:22.808 size: 4.133484 MiB name: evtpool_72072
00:05:22.808 size: 0.026123 MiB name: Session_Pool
00:05:22.808 end mempools-------
00:05:22.808 6 memzones totaling size 4.142822 MiB
00:05:22.808 size: 1.000366 MiB name: RG_ring_0_72072
00:05:22.808 size: 1.000366 MiB name: RG_ring_1_72072
00:05:22.808 size: 1.000366 MiB name: RG_ring_4_72072
00:05:22.808 size: 1.000366 MiB name: RG_ring_5_72072
00:05:22.808 size: 0.125366 MiB name: RG_ring_2_72072
00:05:22.808 size: 0.015991 MiB name: RG_ring_3_72072
00:05:22.808 end memzones-------
00:05:22.808 04:55:42 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /home/vagrant/spdk_repo/spdk/scripts/dpdk_mem_info.py -m 0
00:05:22.808 heap id: 0 total size: 818.000000 MiB number of busy elements: 317 number of free elements: 15
00:05:22.808 list of free elements. size: 10.802490 MiB
00:05:22.808 element at address: 0x200019200000 with size: 0.999878 MiB
00:05:22.808 element at address: 0x200019400000 with size: 0.999878 MiB
00:05:22.808 element at address: 0x200032000000 with size: 0.994446 MiB
00:05:22.808 element at address: 0x200000400000 with size: 0.993958 MiB
00:05:22.808 element at address: 0x200006400000 with size: 0.959839 MiB
00:05:22.808 element at address: 0x200012c00000 with size: 0.944275 MiB
00:05:22.808 element at address: 0x200019600000 with size: 0.936584 MiB
00:05:22.808 element at address: 0x200000200000 with size: 0.717346 MiB
00:05:22.808 element at address: 0x20001ae00000 with size: 0.567688 MiB
00:05:22.808 element at address: 0x20000a600000 with size: 0.488892 MiB
00:05:22.808 element at address: 0x200000c00000 with size: 0.486267 MiB
00:05:22.808 element at address: 0x200019800000 with size: 0.485657 MiB
00:05:22.808 element at address: 0x200003e00000 with size: 0.480286 MiB
00:05:22.808 element at address: 0x200028200000 with size: 0.395752 MiB
00:05:22.808 element at address: 0x200000800000 with size: 0.351746 MiB
00:05:22.808 list of standard malloc elements. size: 199.268616 MiB
00:05:22.808 element at address: 0x20000a7fff80 with size: 132.000122 MiB
00:05:22.808 element at address: 0x2000065fff80 with size: 64.000122 MiB
00:05:22.808 element at address: 0x2000192fff80 with size: 1.000122 MiB
00:05:22.808 element at address: 0x2000194fff80 with size: 1.000122 MiB
00:05:22.808 element at address: 0x2000196fff80 with size: 1.000122 MiB
00:05:22.808 element at address: 0x2000003d9f00 with size: 0.140747 MiB
00:05:22.808 element at address: 0x2000196eff00 with size: 0.062622 MiB
00:05:22.808 element at address: 0x2000003fdf80 with size: 0.007935 MiB
00:05:22.808 element at address: 0x2000196efdc0 with size: 0.000305 MiB
00:05:22.808 element at address: 0x2000002d7c40 with size: 0.000183 MiB
[... several hundred more per-element entries of size 0.000183 MiB (192 bytes) each, at addresses 0x2000003d9e40 through 0x20002826ff00, elided ...]
00:05:22.810 list of memzone associated elements. size: 607.928894 MiB
00:05:22.810 element at address: 0x20001ae95500 with size: 211.416748 MiB
00:05:22.810 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0
00:05:22.810 element at address: 0x20002826ffc0 with size: 157.562561 MiB
00:05:22.810 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0
00:05:22.810 element at address: 0x200012df1e80 with size: 100.055054 MiB
00:05:22.810 associated memzone info: size: 100.054932 MiB name: MP_bdev_io_72072_0
00:05:22.810 element at address: 0x200000dff380 with size: 48.003052 MiB
00:05:22.810 associated memzone info: size: 48.002930 MiB name: MP_msgpool_72072_0
00:05:22.810 element at address: 0x200003ffdb80 with size: 36.008911 MiB
00:05:22.810 associated memzone info: size: 36.008789 MiB name: MP_fsdev_io_72072_0
00:05:22.810 element at address: 0x2000199be940 with size: 20.255554 MiB
00:05:22.810 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0
00:05:22.810 element at address: 0x2000321feb40 with size: 18.005066 MiB
00:05:22.810 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0
00:05:22.810 element at address: 0x2000004fff00 with size: 3.000244 MiB
00:05:22.810 associated memzone info: size: 3.000122 MiB name: MP_evtpool_72072_0
00:05:22.810 element at address: 0x2000009ffe00 with size: 2.000488 MiB
00:05:22.810 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_72072
00:05:22.810 element at address: 0x2000002d7d00 with size: 1.008118 MiB
00:05:22.810 associated memzone info: size: 1.007996 MiB name: MP_evtpool_72072
00:05:22.810 element at address: 0x20000a6fde40 with size: 1.008118 MiB
00:05:22.810 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool
00:05:22.810 element at address: 0x2000198bc800 with size: 1.008118 MiB
00:05:22.810 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool
00:05:22.810 element at address: 0x2000064fde40 with size: 1.008118 MiB
00:05:22.810 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool
00:05:22.810 element at address: 0x200003efba40 with size: 1.008118 MiB
00:05:22.810 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool
00:05:22.810 element at address: 0x200000cff180 with size: 1.000488 MiB
00:05:22.810 associated memzone info: size: 1.000366 MiB name: RG_ring_0_72072
00:05:22.810 element at address: 0x2000008ffc00 with size: 1.000488 MiB
00:05:22.810 associated memzone info: size: 1.000366 MiB name: RG_ring_1_72072
00:05:22.810 element at address: 0x200012cf1c80 with size: 1.000488 MiB
00:05:22.810 associated memzone info: size: 1.000366 MiB name: RG_ring_4_72072
00:05:22.810 element at address: 0x2000320fe940 with size: 1.000488 MiB
00:05:22.810 associated memzone info: size: 1.000366 MiB name: RG_ring_5_72072
00:05:22.810 element at address: 0x20000087f740 with size: 0.500488 MiB
00:05:22.810 associated memzone info: size: 0.500366 MiB name: RG_MP_fsdev_io_72072
00:05:22.810 element at address: 0x200000c7ee00 with size: 0.500488 MiB
00:05:22.810 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_72072
00:05:22.810 element at address: 0x20000a67db80 with size: 0.500488 MiB
00:05:22.810 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:05:22.810 element at address: 0x200003e7b780 with size: 0.500488 MiB
00:05:22.810 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:05:22.810 element at address: 0x20001987c540 with size: 0.250488 MiB
00:05:22.810 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:05:22.810 element at address: 0x2000002b7a40 with size: 0.125488 MiB
00:05:22.810 associated memzone info: size: 0.125366 MiB name: RG_MP_evtpool_72072
00:05:22.810 element at address: 0x20000085e640 with size: 0.125488 MiB
00:05:22.810 associated memzone info: size: 0.125366 MiB name: RG_ring_2_72072
00:05:22.810 element at address: 0x2000064f5b80 with size: 0.031738 MiB
00:05:22.810 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:05:22.810 element at address: 0x200028265680 with size: 0.023743 MiB
00:05:22.810 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:05:22.810 element at address: 0x20000085a380 with size: 0.016113 MiB
00:05:22.810 associated memzone info: size: 0.015991 MiB name: RG_ring_3_72072
00:05:22.810 element at address: 0x20002826b7c0 with size: 0.002441 MiB
00:05:22.810 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:05:22.810 element at address: 0x2000004ffb80 with size: 0.000305 MiB
00:05:22.810 associated memzone info: size: 0.000183 MiB name: MP_msgpool_72072
00:05:22.810 element at address: 0x2000008ffa00 with size: 0.000305 MiB
00:05:22.810 associated memzone info: size: 0.000183 MiB name: MP_fsdev_io_72072
00:05:22.810 element at address: 0x20000085a180 with size: 0.000305 MiB
00:05:22.811 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_72072
00:05:22.811 element at address: 0x20002826c280 with size: 0.000305 MiB
00:05:22.811 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:05:22.811 04:55:42 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:05:22.811 04:55:42 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 72072
00:05:22.811 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' -z 72072 ']'
00:05:22.811 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@958 -- # kill -0 72072
00:05:22.811 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@959 -- # uname
00:05:22.811 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:05:22.811 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 72072
00:05:22.811 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:05:22.811 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
killing process with pid 72072
00:05:22.811 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@972 -- # echo 'killing process with pid 72072'
00:05:22.811 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@973 -- # kill 72072
00:05:22.811 04:55:42 dpdk_mem_utility -- common/autotest_common.sh@978 -- # wait 72072
00:05:23.071
00:05:23.071 real 0m1.612s
00:05:23.071 user 0m1.651s
00:05:23.072 sys 0m0.470s
00:05:23.072 04:55:43 dpdk_mem_utility -- common/autotest_common.sh@1130 -- # xtrace_disable
00:05:23.072 ************************************
00:05:23.072 END TEST dpdk_mem_utility
00:05:23.072 ************************************
00:05:23.072 04:55:43 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:05:23.333 04:55:43 -- spdk/autotest.sh@168 -- # run_test event /home/vagrant/spdk_repo/spdk/test/event/event.sh
00:05:23.333 04:55:43 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:05:23.333 04:55:43 -- common/autotest_common.sh@1111 -- # xtrace_disable
00:05:23.333 04:55:43 -- common/autotest_common.sh@10 -- # set +x
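That is the whole dpdk_mem_utility flow: launch spdk_tgt, invoke the env_dpdk_get_mem_stats RPC (which dumps to /tmp/spdk_mem_dump.txt), and let scripts/dpdk_mem_info.py summarize the dump, first per heap/mempool/memzone and then per element with -m 0. One quick cross-check of such a summary is to re-total the mempool lines; the sketch below assumes the output above was captured to a hypothetical mem_info.txt and simply parses the `size: ... MiB name: ...` format it prints:

    import re

    POOL_RE = re.compile(r"size:\s*([0-9.]+)\s*MiB\s*name:\s*(\S+)")

    def mempool_totals(summary: str) -> dict[str, float]:
        # Collect only the lines between '9 mempools totaling ...'
        # and 'end mempools-------' so memzone entries are not counted.
        pools: dict[str, float] = {}
        in_mempools = False
        for line in summary.splitlines():
            if "mempools totaling" in line:
                in_mempools = True
                continue
            if "end mempools" in line:
                break
            if in_mempools:
                m = POOL_RE.search(line)
                if m:
                    pools[m.group(2)] = float(m.group(1))
        return pools

    with open("mem_info.txt") as f:  # hypothetical capture of the output above
        pools = mempool_totals(f.read())
    # For the run above this yields 9 pools summing to ~603.782043 MiB,
    # matching the reported mempool total.
    print(f"{len(pools)} mempools, {sum(pools.values()):.6f} MiB total")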
00:05:23.333 ************************************ 00:05:23.333 START TEST event 00:05:23.333 ************************************ 00:05:23.333 04:55:43 event -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/event/event.sh 00:05:23.333 * Looking for test storage... 00:05:23.333 * Found test storage at /home/vagrant/spdk_repo/spdk/test/event 00:05:23.333 04:55:43 event -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:23.333 04:55:43 event -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:23.333 04:55:43 event -- common/autotest_common.sh@1711 -- # lcov --version 00:05:23.333 04:55:43 event -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:23.333 04:55:43 event -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:23.333 04:55:43 event -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:23.333 04:55:43 event -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:23.333 04:55:43 event -- scripts/common.sh@336 -- # IFS=.-: 00:05:23.333 04:55:43 event -- scripts/common.sh@336 -- # read -ra ver1 00:05:23.333 04:55:43 event -- scripts/common.sh@337 -- # IFS=.-: 00:05:23.333 04:55:43 event -- scripts/common.sh@337 -- # read -ra ver2 00:05:23.333 04:55:43 event -- scripts/common.sh@338 -- # local 'op=<' 00:05:23.333 04:55:43 event -- scripts/common.sh@340 -- # ver1_l=2 00:05:23.333 04:55:43 event -- scripts/common.sh@341 -- # ver2_l=1 00:05:23.333 04:55:43 event -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:23.333 04:55:43 event -- scripts/common.sh@344 -- # case "$op" in 00:05:23.333 04:55:43 event -- scripts/common.sh@345 -- # : 1 00:05:23.333 04:55:43 event -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:23.333 04:55:43 event -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:23.333 04:55:43 event -- scripts/common.sh@365 -- # decimal 1 00:05:23.333 04:55:43 event -- scripts/common.sh@353 -- # local d=1 00:05:23.333 04:55:43 event -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:23.333 04:55:43 event -- scripts/common.sh@355 -- # echo 1 00:05:23.333 04:55:43 event -- scripts/common.sh@365 -- # ver1[v]=1 00:05:23.333 04:55:43 event -- scripts/common.sh@366 -- # decimal 2 00:05:23.333 04:55:43 event -- scripts/common.sh@353 -- # local d=2 00:05:23.333 04:55:43 event -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:23.333 04:55:43 event -- scripts/common.sh@355 -- # echo 2 00:05:23.333 04:55:43 event -- scripts/common.sh@366 -- # ver2[v]=2 00:05:23.333 04:55:43 event -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:23.333 04:55:43 event -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:23.333 04:55:43 event -- scripts/common.sh@368 -- # return 0 00:05:23.334 04:55:43 event -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:23.334 04:55:43 event -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:23.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:23.334 --rc genhtml_branch_coverage=1 00:05:23.334 --rc genhtml_function_coverage=1 00:05:23.334 --rc genhtml_legend=1 00:05:23.334 --rc geninfo_all_blocks=1 00:05:23.334 --rc geninfo_unexecuted_blocks=1 00:05:23.334 00:05:23.334 ' 00:05:23.334 04:55:43 event -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:23.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:23.334 --rc genhtml_branch_coverage=1 00:05:23.334 --rc genhtml_function_coverage=1 00:05:23.334 --rc genhtml_legend=1 00:05:23.334 --rc 
geninfo_all_blocks=1 00:05:23.334 --rc geninfo_unexecuted_blocks=1 00:05:23.334 00:05:23.334 ' 00:05:23.334 04:55:43 event -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:23.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:23.334 --rc genhtml_branch_coverage=1 00:05:23.334 --rc genhtml_function_coverage=1 00:05:23.334 --rc genhtml_legend=1 00:05:23.334 --rc geninfo_all_blocks=1 00:05:23.334 --rc geninfo_unexecuted_blocks=1 00:05:23.334 00:05:23.334 ' 00:05:23.334 04:55:43 event -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:23.334 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:23.334 --rc genhtml_branch_coverage=1 00:05:23.334 --rc genhtml_function_coverage=1 00:05:23.334 --rc genhtml_legend=1 00:05:23.334 --rc geninfo_all_blocks=1 00:05:23.334 --rc geninfo_unexecuted_blocks=1 00:05:23.334 00:05:23.334 ' 00:05:23.334 04:55:43 event -- event/event.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/bdev/nbd_common.sh 00:05:23.334 04:55:43 event -- bdev/nbd_common.sh@6 -- # set -e 00:05:23.334 04:55:43 event -- event/event.sh@45 -- # run_test event_perf /home/vagrant/spdk_repo/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:23.334 04:55:43 event -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:05:23.334 04:55:43 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:23.334 04:55:43 event -- common/autotest_common.sh@10 -- # set +x 00:05:23.334 ************************************ 00:05:23.334 START TEST event_perf 00:05:23.334 ************************************ 00:05:23.334 04:55:43 event.event_perf -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:05:23.334 Running I/O for 1 seconds...[2024-12-15 04:55:43.446555] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:05:23.334 [2024-12-15 04:55:43.446673] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72158 ] 00:05:23.594 [2024-12-15 04:55:43.607483] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:23.594 [2024-12-15 04:55:43.630010] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:05:23.594 [2024-12-15 04:55:43.630577] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:05:23.594 Running I/O for 1 seconds...[2024-12-15 04:55:43.630859] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:23.594 [2024-12-15 04:55:43.630964] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 3 00:05:24.534 00:05:24.534 lcore 0: 110276 00:05:24.534 lcore 1: 110275 00:05:24.534 lcore 2: 110276 00:05:24.534 lcore 3: 110273 00:05:24.796 done. 
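The lcore counters that event_perf prints are events processed per core during the 1-second run (-m 0xF, -t 1 above), so converting them to a rate is plain arithmetic — roughly 441k events/sec across the four reactors, about 110k per lcore:

    # Aggregate the event_perf counters printed above into an overall rate.
    lcore_counts = {0: 110276, 1: 110275, 2: 110276, 3: 110273}
    duration_s = 1.0  # the test ran with -t 1
    total = sum(lcore_counts.values())
    print(f"{total} events in {duration_s:.0f}s "
          f"-> {total / duration_s:,.0f} events/sec, "
          f"~{total / len(lcore_counts):,.0f} per lcore")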
00:05:24.796 00:05:24.796 real 0m1.280s 00:05:24.796 user 0m4.063s 00:05:24.796 sys 0m0.090s 00:05:24.796 04:55:44 event.event_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:24.796 ************************************ 00:05:24.796 END TEST event_perf 00:05:24.796 ************************************ 00:05:24.796 04:55:44 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:05:24.796 04:55:44 event -- event/event.sh@46 -- # run_test event_reactor /home/vagrant/spdk_repo/spdk/test/event/reactor/reactor -t 1 00:05:24.796 04:55:44 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:05:24.796 04:55:44 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:24.796 04:55:44 event -- common/autotest_common.sh@10 -- # set +x 00:05:24.796 ************************************ 00:05:24.796 START TEST event_reactor 00:05:24.796 ************************************ 00:05:24.796 04:55:44 event.event_reactor -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/event/reactor/reactor -t 1 00:05:24.796 [2024-12-15 04:55:44.793838] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:05:24.796 [2024-12-15 04:55:44.793986] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72192 ] 00:05:25.057 [2024-12-15 04:55:44.950990] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:25.057 [2024-12-15 04:55:44.979857] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:26.001 test_start 00:05:26.001 oneshot 00:05:26.001 tick 100 00:05:26.001 tick 100 00:05:26.001 tick 250 00:05:26.001 tick 100 00:05:26.001 tick 100 00:05:26.001 tick 100 00:05:26.001 tick 250 00:05:26.001 tick 500 00:05:26.001 tick 100 00:05:26.001 tick 100 00:05:26.001 tick 250 00:05:26.001 tick 100 00:05:26.001 tick 100 00:05:26.001 test_end 00:05:26.001 00:05:26.001 real 0m1.276s 00:05:26.001 user 0m1.092s 00:05:26.001 sys 0m0.075s 00:05:26.001 04:55:46 event.event_reactor -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:26.001 ************************************ 00:05:26.001 END TEST event_reactor 00:05:26.001 ************************************ 00:05:26.001 04:55:46 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:05:26.001 04:55:46 event -- event/event.sh@47 -- # run_test event_reactor_perf /home/vagrant/spdk_repo/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:26.001 04:55:46 event -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:05:26.001 04:55:46 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:26.001 04:55:46 event -- common/autotest_common.sh@10 -- # set +x 00:05:26.001 ************************************ 00:05:26.001 START TEST event_reactor_perf 00:05:26.001 ************************************ 00:05:26.001 04:55:46 event.event_reactor_perf -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/event/reactor_perf/reactor_perf -t 1 00:05:26.262 [2024-12-15 04:55:46.143560] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
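The event_reactor trace between test_start and test_end above shows a oneshot event plus three repeating timers; reading 100/250/500 as their relative periods (an interpretation of the printed trace, not something the log states), the fastest timer should dominate the tally — and it does:

    from collections import Counter

    # The tick lines printed between test_start and test_end above:
    trace = ["oneshot", "tick 100", "tick 100", "tick 250", "tick 100",
             "tick 100", "tick 100", "tick 250", "tick 500", "tick 100",
             "tick 100", "tick 250", "tick 100", "tick 100"]
    counts = Counter(trace)
    # tick 100 fired 9x, tick 250 3x, tick 500 1x, oneshot once -- the
    # 9:3:1 ratio is consistent with periods 100/250/500 over a ~1s run.
    print(counts)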
00:05:26.263 [2024-12-15 04:55:46.143695] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72229 ] 00:05:26.263 [2024-12-15 04:55:46.302650] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:26.263 [2024-12-15 04:55:46.323042] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:27.670 test_start 00:05:27.670 test_end 00:05:27.670 Performance: 308199 events per second 00:05:27.670 00:05:27.670 real 0m1.279s 00:05:27.670 user 0m1.098s 00:05:27.670 sys 0m0.072s 00:05:27.670 04:55:47 event.event_reactor_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:27.670 04:55:47 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:05:27.670 ************************************ 00:05:27.670 END TEST event_reactor_perf 00:05:27.670 ************************************ 00:05:27.670 04:55:47 event -- event/event.sh@49 -- # uname -s 00:05:27.670 04:55:47 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:05:27.670 04:55:47 event -- event/event.sh@50 -- # run_test event_scheduler /home/vagrant/spdk_repo/spdk/test/event/scheduler/scheduler.sh 00:05:27.670 04:55:47 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:27.670 04:55:47 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:27.670 04:55:47 event -- common/autotest_common.sh@10 -- # set +x 00:05:27.670 ************************************ 00:05:27.670 START TEST event_scheduler 00:05:27.670 ************************************ 00:05:27.670 04:55:47 event.event_scheduler -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/event/scheduler/scheduler.sh 00:05:27.670 * Looking for test storage... 
00:05:27.670 * Found test storage at /home/vagrant/spdk_repo/spdk/test/event/scheduler 00:05:27.670 04:55:47 event.event_scheduler -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:27.670 04:55:47 event.event_scheduler -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:27.670 04:55:47 event.event_scheduler -- common/autotest_common.sh@1711 -- # lcov --version 00:05:27.670 04:55:47 event.event_scheduler -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:27.670 04:55:47 event.event_scheduler -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:27.670 04:55:47 event.event_scheduler -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:27.670 04:55:47 event.event_scheduler -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:27.670 04:55:47 event.event_scheduler -- scripts/common.sh@336 -- # IFS=.-: 00:05:27.670 04:55:47 event.event_scheduler -- scripts/common.sh@336 -- # read -ra ver1 00:05:27.670 04:55:47 event.event_scheduler -- scripts/common.sh@337 -- # IFS=.-: 00:05:27.670 04:55:47 event.event_scheduler -- scripts/common.sh@337 -- # read -ra ver2 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@338 -- # local 'op=<' 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@340 -- # ver1_l=2 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@341 -- # ver2_l=1 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@344 -- # case "$op" in 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@345 -- # : 1 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@365 -- # decimal 1 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@353 -- # local d=1 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@355 -- # echo 1 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@365 -- # ver1[v]=1 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@366 -- # decimal 2 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@353 -- # local d=2 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@355 -- # echo 2 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@366 -- # ver2[v]=2 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:27.671 04:55:47 event.event_scheduler -- scripts/common.sh@368 -- # return 0 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:27.671 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:27.671 --rc genhtml_branch_coverage=1 00:05:27.671 --rc genhtml_function_coverage=1 00:05:27.671 --rc genhtml_legend=1 00:05:27.671 --rc geninfo_all_blocks=1 00:05:27.671 --rc geninfo_unexecuted_blocks=1 00:05:27.671 00:05:27.671 ' 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:27.671 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:27.671 --rc genhtml_branch_coverage=1 00:05:27.671 --rc genhtml_function_coverage=1 00:05:27.671 --rc genhtml_legend=1 00:05:27.671 --rc geninfo_all_blocks=1 00:05:27.671 --rc geninfo_unexecuted_blocks=1 00:05:27.671 00:05:27.671 ' 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:27.671 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:27.671 --rc genhtml_branch_coverage=1 00:05:27.671 --rc genhtml_function_coverage=1 00:05:27.671 --rc genhtml_legend=1 00:05:27.671 --rc geninfo_all_blocks=1 00:05:27.671 --rc geninfo_unexecuted_blocks=1 00:05:27.671 00:05:27.671 ' 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:27.671 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:27.671 --rc genhtml_branch_coverage=1 00:05:27.671 --rc genhtml_function_coverage=1 00:05:27.671 --rc genhtml_legend=1 00:05:27.671 --rc geninfo_all_blocks=1 00:05:27.671 --rc geninfo_unexecuted_blocks=1 00:05:27.671 00:05:27.671 ' 00:05:27.671 04:55:47 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:05:27.671 04:55:47 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=72299 00:05:27.671 04:55:47 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:05:27.671 04:55:47 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 72299 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@835 -- # '[' -z 72299 ']' 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@839 -- # local 
rpc_addr=/var/tmp/spdk.sock 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:27.671 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:27.671 04:55:47 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:27.671 04:55:47 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /home/vagrant/spdk_repo/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:05:27.671 [2024-12-15 04:55:47.666822] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:05:27.671 [2024-12-15 04:55:47.666936] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72299 ] 00:05:27.933 [2024-12-15 04:55:47.824454] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:05:27.933 [2024-12-15 04:55:47.857270] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:27.933 [2024-12-15 04:55:47.857549] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:05:27.933 [2024-12-15 04:55:47.857809] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 3 00:05:27.933 [2024-12-15 04:55:47.857825] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@868 -- # return 0 00:05:27.933 04:55:47 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:27.933 POWER: failed to open /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:27.933 POWER: Cannot set governor of lcore 0 to userspace 00:05:27.933 POWER: failed to open /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:27.933 POWER: Cannot set governor of lcore 0 to performance 00:05:27.933 POWER: failed to open /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:27.933 POWER: Cannot set governor of lcore 0 to userspace 00:05:27.933 POWER: failed to open /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:05:27.933 POWER: Cannot set governor of lcore 0 to userspace 00:05:27.933 GUEST_CHANNEL: Unable to connect to '/dev/virtio-ports/virtio.serial.port.poweragent.0' with error No such file or directory 00:05:27.933 POWER: Unable to set Power Management Environment for lcore 0 00:05:27.933 [2024-12-15 04:55:47.899542] dpdk_governor.c: 135:_init_core: *ERROR*: Failed to initialize on core0 00:05:27.933 [2024-12-15 04:55:47.899563] dpdk_governor.c: 196:_init: *ERROR*: Failed to initialize on core0 00:05:27.933 [2024-12-15 04:55:47.899588] scheduler_dynamic.c: 280:init: *NOTICE*: Unable to initialize dpdk governor 00:05:27.933 [2024-12-15 04:55:47.899635] scheduler_dynamic.c: 427:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:05:27.933 [2024-12-15 
04:55:47.899644] scheduler_dynamic.c: 429:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:05:27.933 [2024-12-15 04:55:47.899654] scheduler_dynamic.c: 431:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:27.933 04:55:47 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:27.933 [2024-12-15 04:55:47.982793] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:27.933 04:55:47 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:27.933 04:55:47 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:27.933 ************************************ 00:05:27.933 START TEST scheduler_create_thread 00:05:27.933 ************************************ 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1129 -- # scheduler_create_thread 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:27.934 2 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:27.934 3 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:27.934 4 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:27.934 04:55:48 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:27.934 5 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:27.934 6 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:27.934 7 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:27.934 8 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:27.934 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:28.194 9 00:05:28.194 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:28.194 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:05:28.194 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:28.194 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:28.194 10 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:28.195 04:55:48 
event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:28.195 04:55:48 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:29.582 04:55:49 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:29.582 04:55:49 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:05:29.582 04:55:49 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:05:29.582 04:55:49 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:29.582 04:55:49 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:30.522 ************************************ 00:05:30.522 END TEST scheduler_create_thread 00:05:30.522 ************************************ 00:05:30.522 04:55:50 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:30.522 00:05:30.522 real 0m2.613s 00:05:30.522 user 0m0.012s 00:05:30.522 sys 0m0.009s 00:05:30.522 04:55:50 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:30.522 04:55:50 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:05:30.522 04:55:50 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:05:30.522 04:55:50 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 72299 00:05:30.522 04:55:50 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' -z 72299 ']' 00:05:30.522 04:55:50 event.event_scheduler -- common/autotest_common.sh@958 -- # kill -0 72299 00:05:30.522 04:55:50 event.event_scheduler -- common/autotest_common.sh@959 -- # uname 00:05:30.522 04:55:50 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:30.522 04:55:50 event.event_scheduler -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 72299 00:05:30.784 killing process with pid 72299 00:05:30.784 04:55:50 event.event_scheduler -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:05:30.784 04:55:50 event.event_scheduler -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:05:30.784 04:55:50 event.event_scheduler -- common/autotest_common.sh@972 -- # echo 'killing process with pid 72299' 00:05:30.784 04:55:50 event.event_scheduler -- common/autotest_common.sh@973 -- # kill 72299 00:05:30.784 04:55:50 event.event_scheduler -- common/autotest_common.sh@978 -- # wait 72299 00:05:31.045 [2024-12-15 04:55:51.094268] 
scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:05:31.307 00:05:31.307 real 0m3.773s 00:05:31.307 user 0m5.452s 00:05:31.307 sys 0m0.366s 00:05:31.307 04:55:51 event.event_scheduler -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:31.307 ************************************ 00:05:31.307 END TEST event_scheduler 00:05:31.307 ************************************ 00:05:31.307 04:55:51 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:05:31.307 04:55:51 event -- event/event.sh@51 -- # modprobe -n nbd 00:05:31.307 04:55:51 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:05:31.307 04:55:51 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:31.307 04:55:51 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:31.307 04:55:51 event -- common/autotest_common.sh@10 -- # set +x 00:05:31.307 ************************************ 00:05:31.307 START TEST app_repeat 00:05:31.307 ************************************ 00:05:31.307 04:55:51 event.app_repeat -- common/autotest_common.sh@1129 -- # app_repeat_test 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:05:31.307 Process app_repeat pid: 72387 00:05:31.307 spdk_app_start Round 0 00:05:31.307 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@19 -- # repeat_pid=72387 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 72387' 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@25 -- # waitforlisten 72387 /var/tmp/spdk-nbd.sock 00:05:31.307 04:55:51 event.app_repeat -- event/event.sh@18 -- # /home/vagrant/spdk_repo/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:05:31.307 04:55:51 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 72387 ']' 00:05:31.307 04:55:51 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:31.307 04:55:51 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:31.307 04:55:51 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:31.307 04:55:51 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:31.307 04:55:51 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:05:31.307 [2024-12-15 04:55:51.345431] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
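Taken together, the event_scheduler trace above reduces to a short RPC conversation with the target. A minimal sketch in the same shell style, assuming a target started with --wait-for-rpc listening on /var/tmp/spdk.sock and the test's scheduler_plugin module on PYTHONPATH (both paths illustrative):

  # Talk to the scheduler test app over its UNIX socket.
  rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
  export PYTHONPATH=/home/vagrant/spdk_repo/spdk/test/event/scheduler

  # Select the dynamic scheduler; the "20 / 80 / 95" NOTICE lines above are
  # its default load limit, core limit, and core-busy thresholds.
  $rpc framework_set_scheduler dynamic

  # Finish subsystem init so the reactors begin running the scheduler.
  $rpc framework_start_init

  # Create threads through the test plugin, as the trace does per core mask:
  # a busy thread pinned to one core, and an idle one alongside it.
  $rpc --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
  $rpc --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0

The POWER and GUEST_CHANNEL errors above are expected inside a VM: the DPDK governor cannot reach the cpufreq sysfs files or the virtio power channel, so it fails to initialize and the dynamic scheduler simply runs without it, as the NOTICE lines confirm.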
00:05:31.307 [2024-12-15 04:55:51.345548] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72387 ] 00:05:31.569 [2024-12-15 04:55:51.504342] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:31.569 [2024-12-15 04:55:51.535119] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:05:31.569 [2024-12-15 04:55:51.535168] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:32.142 04:55:52 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:32.142 04:55:52 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:05:32.142 04:55:52 event.app_repeat -- event/event.sh@27 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:32.403 Malloc0 00:05:32.403 04:55:52 event.app_repeat -- event/event.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:32.664 Malloc1 00:05:32.664 04:55:52 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:32.664 04:55:52 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:32.926 /dev/nbd0 00:05:32.926 04:55:52 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:32.926 04:55:52 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:05:32.926 04:55:52 event.app_repeat -- 
common/autotest_common.sh@877 -- # break 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:32.926 1+0 records in 00:05:32.926 1+0 records out 00:05:32.926 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000237719 s, 17.2 MB/s 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:05:32.926 04:55:52 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:05:32.926 04:55:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:32.926 04:55:52 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:32.926 04:55:52 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:33.186 /dev/nbd1 00:05:33.186 04:55:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:33.186 04:55:53 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:33.186 1+0 records in 00:05:33.186 1+0 records out 00:05:33.186 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265424 s, 15.4 MB/s 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:05:33.186 04:55:53 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:05:33.186 04:55:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:33.186 04:55:53 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:33.186 04:55:53 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:33.186 04:55:53 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 
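The waitfornbd checks traced above (autotest_common.sh@872-@893) follow a simple pattern; a rough re-creation, with an illustrative scratch path in place of the repo's nbdtest file:

  # Poll /proc/partitions until the NBD device appears, then prove it is
  # readable with a single 4 KiB O_DIRECT read.
  waitfornbd() {
    local nbd_name=$1 i size
    for ((i = 1; i <= 20; i++)); do
      grep -q -w "$nbd_name" /proc/partitions && break
      sleep 0.1
    done
    ((i <= 20)) || return 1                 # device never showed up
    dd if="/dev/$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
    size=$(stat -c %s /tmp/nbdtest)         # the "1+0 records" / stat chatter above
    rm -f /tmp/nbdtest
    [ "$size" != 0 ]                        # a zero-byte read means a dead device
  }

Called as waitfornbd nbd0 after each nbd_start_disk, which is exactly what produces the "1+0 records in/out" and stat -c %s lines in the trace.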
00:05:33.186 04:55:53 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:33.448 { 00:05:33.448 "nbd_device": "/dev/nbd0", 00:05:33.448 "bdev_name": "Malloc0" 00:05:33.448 }, 00:05:33.448 { 00:05:33.448 "nbd_device": "/dev/nbd1", 00:05:33.448 "bdev_name": "Malloc1" 00:05:33.448 } 00:05:33.448 ]' 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:33.448 { 00:05:33.448 "nbd_device": "/dev/nbd0", 00:05:33.448 "bdev_name": "Malloc0" 00:05:33.448 }, 00:05:33.448 { 00:05:33.448 "nbd_device": "/dev/nbd1", 00:05:33.448 "bdev_name": "Malloc1" 00:05:33.448 } 00:05:33.448 ]' 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:33.448 /dev/nbd1' 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:33.448 /dev/nbd1' 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:33.448 256+0 records in 00:05:33.448 256+0 records out 00:05:33.448 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0111842 s, 93.8 MB/s 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:33.448 256+0 records in 00:05:33.448 256+0 records out 00:05:33.448 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.017189 s, 61.0 MB/s 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:33.448 256+0 records in 00:05:33.448 256+0 records out 00:05:33.448 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0242805 s, 43.2 MB/s 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:33.448 04:55:53 
event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd0 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd1 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:05:33.448 04:55:53 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:33.449 04:55:53 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:33.449 04:55:53 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:33.449 04:55:53 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:33.449 04:55:53 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:05:33.449 04:55:53 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:33.449 04:55:53 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:33.710 04:55:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:33.710 04:55:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:33.710 04:55:53 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:33.710 04:55:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:33.710 04:55:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:33.710 04:55:53 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:33.710 04:55:53 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:33.710 04:55:53 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:33.710 04:55:53 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:33.710 04:55:53 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:33.971 04:55:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:33.971 04:55:53 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:33.971 04:55:53 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:33.971 04:55:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:33.971 04:55:53 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:33.971 04:55:53 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:33.971 04:55:53 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:33.971 04:55:53 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:33.971 04:55:53 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:33.971 04:55:53 event.app_repeat -- 
bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:33.971 04:55:53 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:34.232 04:55:54 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:05:34.233 04:55:54 event.app_repeat -- event/event.sh@34 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:34.494 04:55:54 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:05:34.494 [2024-12-15 04:55:54.464629] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:34.494 [2024-12-15 04:55:54.481652] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:05:34.494 [2024-12-15 04:55:54.481736] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:34.494 [2024-12-15 04:55:54.512508] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:34.494 [2024-12-15 04:55:54.512567] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:37.796 spdk_app_start Round 1 00:05:37.796 04:55:57 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:05:37.796 04:55:57 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:05:37.796 04:55:57 event.app_repeat -- event/event.sh@25 -- # waitforlisten 72387 /var/tmp/spdk-nbd.sock 00:05:37.796 04:55:57 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 72387 ']' 00:05:37.796 04:55:57 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:37.796 04:55:57 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:37.796 04:55:57 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:05:37.796 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
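Round 0 above, stripped of the xtrace noise, is this round trip (commands as in the trace; the scratch file path is illustrative):

  rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"

  # Two 64 MiB malloc bdevs with 4 KiB blocks, exported as NBD devices.
  $rpc bdev_malloc_create 64 4096          # -> Malloc0
  $rpc bdev_malloc_create 64 4096          # -> Malloc1
  $rpc nbd_start_disk Malloc0 /dev/nbd0
  $rpc nbd_start_disk Malloc1 /dev/nbd1

  # Write 1 MiB of random data through each device, then verify it reads
  # back byte-for-byte (the dd/cmp pairs in the trace).
  dd if=/dev/urandom of=/tmp/nbdrandtest bs=4096 count=256
  for dev in /dev/nbd0 /dev/nbd1; do
    dd if=/tmp/nbdrandtest of="$dev" bs=4096 count=256 oflag=direct
    cmp -b -n 1M /tmp/nbdrandtest "$dev"
  done
  rm /tmp/nbdrandtest

  # Tear down in reverse.
  $rpc nbd_stop_disk /dev/nbd0
  $rpc nbd_stop_disk /dev/nbd1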
00:05:37.796 04:55:57 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:37.796 04:55:57 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:05:37.796 04:55:57 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:37.796 04:55:57 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:05:37.796 04:55:57 event.app_repeat -- event/event.sh@27 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:37.796 Malloc0 00:05:37.796 04:55:57 event.app_repeat -- event/event.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:38.054 Malloc1 00:05:38.054 04:55:58 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:38.054 04:55:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:38.312 /dev/nbd0 00:05:38.312 04:55:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:38.312 04:55:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:38.312 1+0 records in 00:05:38.312 1+0 records out 
00:05:38.312 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000189715 s, 21.6 MB/s 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:05:38.312 04:55:58 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:05:38.312 04:55:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:38.312 04:55:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:38.312 04:55:58 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:38.312 /dev/nbd1 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:38.571 1+0 records in 00:05:38.571 1+0 records out 00:05:38.571 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300961 s, 13.6 MB/s 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:05:38.571 04:55:58 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:38.571 { 00:05:38.571 "nbd_device": "/dev/nbd0", 00:05:38.571 "bdev_name": "Malloc0" 00:05:38.571 }, 00:05:38.571 { 00:05:38.571 "nbd_device": "/dev/nbd1", 00:05:38.571 "bdev_name": "Malloc1" 00:05:38.571 } 
00:05:38.571 ]' 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:38.571 { 00:05:38.571 "nbd_device": "/dev/nbd0", 00:05:38.571 "bdev_name": "Malloc0" 00:05:38.571 }, 00:05:38.571 { 00:05:38.571 "nbd_device": "/dev/nbd1", 00:05:38.571 "bdev_name": "Malloc1" 00:05:38.571 } 00:05:38.571 ]' 00:05:38.571 04:55:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:38.571 /dev/nbd1' 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:38.829 /dev/nbd1' 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:38.829 256+0 records in 00:05:38.829 256+0 records out 00:05:38.829 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00694033 s, 151 MB/s 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:38.829 256+0 records in 00:05:38.829 256+0 records out 00:05:38.829 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0144777 s, 72.4 MB/s 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:38.829 256+0 records in 00:05:38.829 256+0 records out 00:05:38.829 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0186875 s, 56.1 MB/s 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:38.829 04:55:58 event.app_repeat -- 
bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd0 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd1 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:38.829 04:55:58 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:39.087 04:55:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:39.345 04:55:59 event.app_repeat -- 
bdev/nbd_common.sh@64 -- # echo '[]' 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:39.345 04:55:59 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:05:39.345 04:55:59 event.app_repeat -- event/event.sh@34 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:39.604 04:55:59 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:05:39.604 [2024-12-15 04:55:59.739921] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:39.863 [2024-12-15 04:55:59.755274] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:05:39.863 [2024-12-15 04:55:59.755480] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:39.863 [2024-12-15 04:55:59.791877] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:39.863 [2024-12-15 04:55:59.791928] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:43.143 spdk_app_start Round 2 00:05:43.143 04:56:02 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:05:43.143 04:56:02 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:05:43.143 04:56:02 event.app_repeat -- event/event.sh@25 -- # waitforlisten 72387 /var/tmp/spdk-nbd.sock 00:05:43.144 04:56:02 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 72387 ']' 00:05:43.144 04:56:02 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:43.144 04:56:02 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:43.144 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:43.144 04:56:02 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
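The empty-list check at the end of each round is worth calling out: nbd_get_disks returns '[]' once both disks are stopped, and the count is derived as below. Note the || true, since grep -c exits non-zero when it counts zero matches:

  rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock"
  disks_json=$($rpc nbd_get_disks)                          # '[]' after teardown
  names=$(echo "$disks_json" | jq -r '.[] | .nbd_device')   # empty on '[]'
  count=$(echo "$names" | grep -c /dev/nbd || true)         # 0, without tripping set -e
  [ "$count" -eq 0 ] && echo 'all NBD disks stopped'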
00:05:43.144 04:56:02 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:43.144 04:56:02 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:05:43.144 04:56:02 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:43.144 04:56:02 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:05:43.144 04:56:02 event.app_repeat -- event/event.sh@27 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:43.144 Malloc0 00:05:43.144 04:56:03 event.app_repeat -- event/event.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:05:43.401 Malloc1 00:05:43.401 04:56:03 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:05:43.401 04:56:03 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:05:43.402 04:56:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:05:43.402 04:56:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:43.402 04:56:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:05:43.402 /dev/nbd0 00:05:43.402 04:56:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:05:43.402 04:56:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:43.402 1+0 records in 00:05:43.402 1+0 records out 
00:05:43.402 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000479668 s, 8.5 MB/s 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:05:43.402 04:56:03 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:05:43.402 04:56:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:43.402 04:56:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:43.402 04:56:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:05:43.736 /dev/nbd1 00:05:43.736 04:56:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:05:43.736 04:56:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@873 -- # local i 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@877 -- # break 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:05:43.736 1+0 records in 00:05:43.736 1+0 records out 00:05:43.736 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000180865 s, 22.6 MB/s 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@890 -- # size=4096 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/event/nbdtest 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:05:43.736 04:56:03 event.app_repeat -- common/autotest_common.sh@893 -- # return 0 00:05:43.736 04:56:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:05:43.736 04:56:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:05:43.736 04:56:03 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:43.736 04:56:03 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:43.736 04:56:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:44.001 04:56:03 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:05:44.002 { 00:05:44.002 "nbd_device": "/dev/nbd0", 00:05:44.002 "bdev_name": "Malloc0" 00:05:44.002 }, 00:05:44.002 { 00:05:44.002 "nbd_device": "/dev/nbd1", 00:05:44.002 "bdev_name": "Malloc1" 00:05:44.002 } 
00:05:44.002 ]' 00:05:44.002 04:56:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:05:44.002 { 00:05:44.002 "nbd_device": "/dev/nbd0", 00:05:44.002 "bdev_name": "Malloc0" 00:05:44.002 }, 00:05:44.002 { 00:05:44.002 "nbd_device": "/dev/nbd1", 00:05:44.002 "bdev_name": "Malloc1" 00:05:44.002 } 00:05:44.002 ]' 00:05:44.002 04:56:03 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:05:44.002 /dev/nbd1' 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:05:44.002 /dev/nbd1' 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest bs=4096 count=256 00:05:44.002 256+0 records in 00:05:44.002 256+0 records out 00:05:44.002 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00753902 s, 139 MB/s 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:05:44.002 256+0 records in 00:05:44.002 256+0 records out 00:05:44.002 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0182444 s, 57.5 MB/s 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:05:44.002 256+0 records in 00:05:44.002 256+0 records out 00:05:44.002 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0158177 s, 66.3 MB/s 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:05:44.002 04:56:04 event.app_repeat -- 
bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd0 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest /dev/nbd1 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /home/vagrant/spdk_repo/spdk/test/event/nbdrandtest 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:44.002 04:56:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:05:44.260 04:56:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:05:44.260 04:56:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:05:44.260 04:56:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:05:44.260 04:56:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:44.260 04:56:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:44.260 04:56:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:05:44.260 04:56:04 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:44.260 04:56:04 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:44.260 04:56:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:05:44.260 04:56:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:05:44.518 04:56:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@64 
-- # jq -r '.[] | .nbd_device' 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:05:44.776 04:56:04 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:05:44.776 04:56:04 event.app_repeat -- event/event.sh@34 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:05:45.033 04:56:04 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:05:45.033 [2024-12-15 04:56:05.029627] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:05:45.033 [2024-12-15 04:56:05.046309] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:05:45.033 [2024-12-15 04:56:05.046421] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:45.033 [2024-12-15 04:56:05.077303] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:05:45.033 [2024-12-15 04:56:05.077350] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:05:48.315 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:05:48.315 04:56:07 event.app_repeat -- event/event.sh@38 -- # waitforlisten 72387 /var/tmp/spdk-nbd.sock 00:05:48.315 04:56:07 event.app_repeat -- common/autotest_common.sh@835 -- # '[' -z 72387 ']' 00:05:48.315 04:56:07 event.app_repeat -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:05:48.315 04:56:07 event.app_repeat -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:48.315 04:56:07 event.app_repeat -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
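The block above exercises nbd_common.sh's data-verify path: 1 MiB of random data is written to a scratch file, copied onto each exported /dev/nbd device with O_DIRECT, then read back and byte-compared against the source. A minimal standalone sketch of the same pattern (illustrative names; assumes the devices are already exported and writable, which normally requires root):

    #!/usr/bin/env bash
    # Sketch of the nbd write/verify pattern traced above (illustrative names).
    set -euo pipefail

    nbd_list=(/dev/nbd0 /dev/nbd1)            # assumed: devices already exported
    tmp_file=$(mktemp /tmp/nbdrandtest.XXXXXX)

    # Write phase: 1 MiB of random data, copied to every device with O_DIRECT.
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
    for dev in "${nbd_list[@]}"; do
        dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
    done

    # Verify phase: byte-compare the first 1 MiB of each device with the source.
    for dev in "${nbd_list[@]}"; do
        cmp -b -n 1M "$tmp_file" "$dev"       # any mismatch exits non-zero
    done
    rm "$tmp_file"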
00:05:48.315 04:56:07 event.app_repeat -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:48.315 04:56:07 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@868 -- # return 0 00:05:48.315 04:56:08 event.app_repeat -- event/event.sh@39 -- # killprocess 72387 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@954 -- # '[' -z 72387 ']' 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@958 -- # kill -0 72387 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@959 -- # uname 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 72387 00:05:48.315 killing process with pid 72387 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@972 -- # echo 'killing process with pid 72387' 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@973 -- # kill 72387 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@978 -- # wait 72387 00:05:48.315 spdk_app_start is called in Round 0. 00:05:48.315 Shutdown signal received, stop current app iteration 00:05:48.315 Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 reinitialization... 00:05:48.315 spdk_app_start is called in Round 1. 00:05:48.315 Shutdown signal received, stop current app iteration 00:05:48.315 Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 reinitialization... 00:05:48.315 spdk_app_start is called in Round 2. 00:05:48.315 Shutdown signal received, stop current app iteration 00:05:48.315 Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 reinitialization... 00:05:48.315 spdk_app_start is called in Round 3. 00:05:48.315 Shutdown signal received, stop current app iteration 00:05:48.315 ************************************ 00:05:48.315 END TEST app_repeat 00:05:48.315 ************************************ 00:05:48.315 04:56:08 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:05:48.315 04:56:08 event.app_repeat -- event/event.sh@42 -- # return 0 00:05:48.315 00:05:48.315 real 0m16.996s 00:05:48.315 user 0m38.029s 00:05:48.315 sys 0m2.105s 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:48.315 04:56:08 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:05:48.315 04:56:08 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:05:48.315 04:56:08 event -- event/event.sh@55 -- # run_test cpu_locks /home/vagrant/spdk_repo/spdk/test/event/cpu_locks.sh 00:05:48.315 04:56:08 event -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:48.315 04:56:08 event -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:48.315 04:56:08 event -- common/autotest_common.sh@10 -- # set +x 00:05:48.315 ************************************ 00:05:48.315 START TEST cpu_locks 00:05:48.315 ************************************ 00:05:48.315 04:56:08 event.cpu_locks -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/event/cpu_locks.sh 00:05:48.315 * Looking for test storage... 
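The killprocess trace above follows a fixed sequence: confirm the pid is alive with kill -0, read its command name with ps so a sudo process is never killed directly, then kill and reap it. A rough sketch of that helper, assuming Linux and that the target is a child of the calling shell (the real version in test/common/autotest_common.sh also handles the sudo and non-Linux branches):

    killprocess_sketch() {
        local pid=$1
        kill -0 "$pid" || return 1                  # still running?
        local name
        name=$(ps --no-headers -o comm= "$pid")     # e.g. reactor_0 for an SPDK app
        [ "$name" = sudo ] && return 1              # the real helper escalates instead
        echo "killing process with pid $pid"
        kill "$pid"
        wait "$pid" || true                         # reap if it is our child
    }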
00:05:48.315 * Found test storage at /home/vagrant/spdk_repo/spdk/test/event 00:05:48.315 04:56:08 event.cpu_locks -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:05:48.315 04:56:08 event.cpu_locks -- common/autotest_common.sh@1711 -- # lcov --version 00:05:48.315 04:56:08 event.cpu_locks -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:05:48.575 04:56:08 event.cpu_locks -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@333 -- # local ver1 ver1_l 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@334 -- # local ver2 ver2_l 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@336 -- # IFS=.-: 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@336 -- # read -ra ver1 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@337 -- # IFS=.-: 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@337 -- # read -ra ver2 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@338 -- # local 'op=<' 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@340 -- # ver1_l=2 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@341 -- # ver2_l=1 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@344 -- # case "$op" in 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@345 -- # : 1 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@364 -- # (( v = 0 )) 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@365 -- # decimal 1 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@353 -- # local d=1 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@355 -- # echo 1 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@365 -- # ver1[v]=1 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@366 -- # decimal 2 00:05:48.575 04:56:08 event.cpu_locks -- scripts/common.sh@353 -- # local d=2 00:05:48.576 04:56:08 event.cpu_locks -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:05:48.576 04:56:08 event.cpu_locks -- scripts/common.sh@355 -- # echo 2 00:05:48.576 04:56:08 event.cpu_locks -- scripts/common.sh@366 -- # ver2[v]=2 00:05:48.576 04:56:08 event.cpu_locks -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:05:48.576 04:56:08 event.cpu_locks -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:05:48.576 04:56:08 event.cpu_locks -- scripts/common.sh@368 -- # return 0 00:05:48.576 04:56:08 event.cpu_locks -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:05:48.576 04:56:08 event.cpu_locks -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:05:48.576 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.576 --rc genhtml_branch_coverage=1 00:05:48.576 --rc genhtml_function_coverage=1 00:05:48.576 --rc genhtml_legend=1 00:05:48.576 --rc geninfo_all_blocks=1 00:05:48.576 --rc geninfo_unexecuted_blocks=1 00:05:48.576 00:05:48.576 ' 00:05:48.576 04:56:08 event.cpu_locks -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:05:48.576 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.576 --rc genhtml_branch_coverage=1 00:05:48.576 --rc genhtml_function_coverage=1 
00:05:48.576 --rc genhtml_legend=1 00:05:48.576 --rc geninfo_all_blocks=1 00:05:48.576 --rc geninfo_unexecuted_blocks=1 00:05:48.576 00:05:48.576 ' 00:05:48.576 04:56:08 event.cpu_locks -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:05:48.576 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.576 --rc genhtml_branch_coverage=1 00:05:48.576 --rc genhtml_function_coverage=1 00:05:48.576 --rc genhtml_legend=1 00:05:48.576 --rc geninfo_all_blocks=1 00:05:48.576 --rc geninfo_unexecuted_blocks=1 00:05:48.576 00:05:48.576 ' 00:05:48.576 04:56:08 event.cpu_locks -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:05:48.576 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:05:48.576 --rc genhtml_branch_coverage=1 00:05:48.576 --rc genhtml_function_coverage=1 00:05:48.576 --rc genhtml_legend=1 00:05:48.576 --rc geninfo_all_blocks=1 00:05:48.576 --rc geninfo_unexecuted_blocks=1 00:05:48.576 00:05:48.576 ' 00:05:48.576 04:56:08 event.cpu_locks -- event/cpu_locks.sh@11 -- # rpc_sock1=/var/tmp/spdk.sock 00:05:48.576 04:56:08 event.cpu_locks -- event/cpu_locks.sh@12 -- # rpc_sock2=/var/tmp/spdk2.sock 00:05:48.576 04:56:08 event.cpu_locks -- event/cpu_locks.sh@164 -- # trap cleanup EXIT SIGTERM SIGINT 00:05:48.576 04:56:08 event.cpu_locks -- event/cpu_locks.sh@166 -- # run_test default_locks default_locks 00:05:48.576 04:56:08 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:48.576 04:56:08 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:48.576 04:56:08 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:05:48.576 ************************************ 00:05:48.576 START TEST default_locks 00:05:48.576 ************************************ 00:05:48.576 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:48.576 04:56:08 event.cpu_locks.default_locks -- common/autotest_common.sh@1129 -- # default_locks 00:05:48.576 04:56:08 event.cpu_locks.default_locks -- event/cpu_locks.sh@46 -- # spdk_tgt_pid=72812 00:05:48.576 04:56:08 event.cpu_locks.default_locks -- event/cpu_locks.sh@47 -- # waitforlisten 72812 00:05:48.576 04:56:08 event.cpu_locks.default_locks -- event/cpu_locks.sh@45 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:05:48.576 04:56:08 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 72812 ']' 00:05:48.576 04:56:08 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:48.576 04:56:08 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:48.576 04:56:08 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:48.576 04:56:08 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:48.576 04:56:08 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:05:48.576 [2024-12-15 04:56:08.574551] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
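The scripts/common.sh trace above ('lt 1.15 2' via cmp_versions) splits each version string on '.', '-' and ':' and compares the results component by component. A condensed sketch of that logic, assuming purely numeric components (the real helper also normalizes non-numeric parts through its decimal() function):

    lt_sketch() {               # usage: lt_sketch 1.15 2  -> status 0 iff $1 < $2
        local IFS=.-:
        local -a ver1 ver2
        read -ra ver1 <<< "$1"
        read -ra ver2 <<< "$2"
        local v a b
        local len=$(( ${#ver1[@]} > ${#ver2[@]} ? ${#ver1[@]} : ${#ver2[@]} ))
        for (( v = 0; v < len; v++ )); do
            a=${ver1[v]:-0} b=${ver2[v]:-0}     # missing components count as 0
            (( a < b )) && return 0
            (( a > b )) && return 1
        done
        return 1                # equal versions are not strictly less-than
    }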
00:05:48.576 [2024-12-15 04:56:08.574758] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72812 ] 00:05:48.835 [2024-12-15 04:56:08.726239] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:48.835 [2024-12-15 04:56:08.743868] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 0 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@49 -- # locks_exist 72812 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # lslocks -p 72812 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@50 -- # killprocess 72812 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@954 -- # '[' -z 72812 ']' 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@958 -- # kill -0 72812 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # uname 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:49.406 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 72812 00:05:49.668 killing process with pid 72812 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 72812' 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@973 -- # kill 72812 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@978 -- # wait 72812 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@52 -- # NOT waitforlisten 72812 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@652 -- # local es=0 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 72812 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # waitforlisten 72812 00:05:49.668 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:05:49.668 ERROR: process (pid: 72812) is no longer running 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@835 -- # '[' -z 72812 ']' 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:05:49.668 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 850: kill: (72812) - No such process 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@868 -- # return 1 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@655 -- # es=1 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@54 -- # no_locks 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@26 -- # local lock_files 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:49.668 00:05:49.668 real 0m1.290s 00:05:49.668 user 0m1.309s 00:05:49.668 sys 0m0.372s 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:49.668 ************************************ 00:05:49.668 END TEST default_locks 00:05:49.668 04:56:09 event.cpu_locks.default_locks -- common/autotest_common.sh@10 -- # set +x 00:05:49.668 ************************************ 00:05:49.929 04:56:09 event.cpu_locks -- event/cpu_locks.sh@167 -- # run_test default_locks_via_rpc default_locks_via_rpc 00:05:49.929 04:56:09 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:49.929 04:56:09 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:49.929 04:56:09 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:05:49.929 ************************************ 00:05:49.929 START TEST default_locks_via_rpc 00:05:49.929 ************************************ 00:05:49.929 04:56:09 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1129 -- # default_locks_via_rpc 00:05:49.930 04:56:09 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@62 -- # spdk_tgt_pid=72854 00:05:49.930 04:56:09 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@63 -- # waitforlisten 72854 00:05:49.930 04:56:09 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 72854 ']' 00:05:49.930 04:56:09 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:49.930 04:56:09 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@840 -- # local 
max_retries=100 00:05:49.930 04:56:09 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@61 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:05:49.930 04:56:09 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:49.930 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:49.930 04:56:09 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:49.930 04:56:09 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:49.930 [2024-12-15 04:56:09.926539] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:05:49.930 [2024-12-15 04:56:09.926677] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72854 ] 00:05:50.191 [2024-12-15 04:56:10.082668] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.191 [2024-12-15 04:56:10.106804] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@65 -- # rpc_cmd framework_disable_cpumask_locks 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@67 -- # no_locks 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # lock_files=() 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@26 -- # local lock_files 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@27 -- # (( 0 != 0 )) 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@69 -- # rpc_cmd framework_enable_cpumask_locks 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@71 -- # locks_exist 72854 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # lslocks -p 72854 00:05:50.762 04:56:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:51.021 04:56:10 event.cpu_locks.default_locks_via_rpc -- event/cpu_locks.sh@73 -- # killprocess 72854 00:05:51.021 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@954 -- # '[' -z 72854 ']' 00:05:51.021 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@958 -- # kill -0 72854 00:05:51.021 04:56:10 
event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # uname 00:05:51.021 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:51.021 04:56:10 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 72854 00:05:51.021 killing process with pid 72854 00:05:51.021 04:56:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:51.021 04:56:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:51.021 04:56:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 72854' 00:05:51.021 04:56:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@973 -- # kill 72854 00:05:51.021 04:56:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@978 -- # wait 72854 00:05:51.281 00:05:51.281 real 0m1.392s 00:05:51.281 user 0m1.452s 00:05:51.281 sys 0m0.408s 00:05:51.281 04:56:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:51.281 ************************************ 00:05:51.281 END TEST default_locks_via_rpc 00:05:51.281 ************************************ 00:05:51.281 04:56:11 event.cpu_locks.default_locks_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:51.281 04:56:11 event.cpu_locks -- event/cpu_locks.sh@168 -- # run_test non_locking_app_on_locked_coremask non_locking_app_on_locked_coremask 00:05:51.281 04:56:11 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:51.281 04:56:11 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:51.281 04:56:11 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:05:51.281 ************************************ 00:05:51.281 START TEST non_locking_app_on_locked_coremask 00:05:51.281 ************************************ 00:05:51.281 04:56:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # non_locking_app_on_locked_coremask 00:05:51.281 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:51.281 04:56:11 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@80 -- # spdk_tgt_pid=72900 00:05:51.281 04:56:11 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@81 -- # waitforlisten 72900 /var/tmp/spdk.sock 00:05:51.281 04:56:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 72900 ']' 00:05:51.281 04:56:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:51.281 04:56:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:51.281 04:56:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
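locks_exist (traced above for pid 72854) asserts that the target still holds its CPU-core file locks: lslocks lists every lock owned by the pid, and the test greps for the spdk_cpu_lock path fragment. A sketch of that check, assuming util-linux lslocks is installed:

    locks_exist_sketch() {
        local pid=$1
        # Succeeds only if the pid holds at least one spdk_cpu_lock_* flock.
        lslocks -p "$pid" | grep -q spdk_cpu_lock
    }

    # e.g.: locks_exist_sketch 72854 && echo "core locks held"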
00:05:51.281 04:56:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:51.281 04:56:11 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:05:51.281 04:56:11 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@79 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:05:51.281 [2024-12-15 04:56:11.364832] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:05:51.281 [2024-12-15 04:56:11.365104] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72900 ] 00:05:51.540 [2024-12-15 04:56:11.513959] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:51.540 [2024-12-15 04:56:11.530979] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:52.112 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@83 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks -r /var/tmp/spdk2.sock 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@84 -- # spdk_tgt_pid2=72911 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@85 -- # waitforlisten 72911 /var/tmp/spdk2.sock 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 72911 ']' 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:52.112 04:56:12 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:05:52.373 [2024-12-15 04:56:12.293565] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:05:52.373 [2024-12-15 04:56:12.293673] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72911 ] 00:05:52.373 [2024-12-15 04:56:12.453574] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
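Each 'Waiting for process to start up and listen on UNIX domain socket ...' line comes from the waitforlisten helper, which polls until the app answers JSON-RPC on its socket. A simplified sketch, assuming it runs from an SPDK checkout so scripts/rpc.py resolves (the real helper has richer pid and retry handling):

    waitforlisten_sketch() {
        local pid=$1 sock=${2:-/var/tmp/spdk.sock} i
        echo "Waiting for process to start up and listen on UNIX domain socket $sock..."
        for (( i = 0; i < 100; i++ )); do       # max_retries=100, as traced above
            kill -0 "$pid" || return 1          # app died while starting
            if scripts/rpc.py -t 1 -s "$sock" rpc_get_methods &>/dev/null; then
                return 0                        # socket is up and answering RPC
            fi
            sleep 0.1
        done
        return 1
    }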
00:05:52.373 [2024-12-15 04:56:12.453616] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:52.373 [2024-12-15 04:56:12.486136] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:53.317 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:53.317 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:05:53.317 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@87 -- # locks_exist 72900 00:05:53.317 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 72900 00:05:53.317 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:53.317 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@89 -- # killprocess 72900 00:05:53.317 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 72900 ']' 00:05:53.317 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 72900 00:05:53.317 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:05:53.317 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:53.578 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 72900 00:05:53.578 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:53.578 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:53.578 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 72900' 00:05:53.578 killing process with pid 72900 00:05:53.578 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 72900 00:05:53.578 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 72900 00:05:53.839 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- event/cpu_locks.sh@90 -- # killprocess 72911 00:05:53.839 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 72911 ']' 00:05:53.839 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 72911 00:05:53.839 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:05:53.839 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:53.839 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 72911 00:05:53.839 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:53.839 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:53.839 killing process with pid 72911 00:05:53.839 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 72911' 00:05:53.839 04:56:13 
event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 72911 00:05:53.839 04:56:13 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 72911 00:05:54.099 00:05:54.099 real 0m2.860s 00:05:54.099 user 0m3.196s 00:05:54.099 sys 0m0.766s 00:05:54.100 04:56:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:54.100 04:56:14 event.cpu_locks.non_locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:05:54.100 ************************************ 00:05:54.100 END TEST non_locking_app_on_locked_coremask 00:05:54.100 ************************************ 00:05:54.100 04:56:14 event.cpu_locks -- event/cpu_locks.sh@169 -- # run_test locking_app_on_unlocked_coremask locking_app_on_unlocked_coremask 00:05:54.100 04:56:14 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:54.100 04:56:14 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:54.100 04:56:14 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:05:54.100 ************************************ 00:05:54.100 START TEST locking_app_on_unlocked_coremask 00:05:54.100 ************************************ 00:05:54.100 04:56:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_unlocked_coremask 00:05:54.100 04:56:14 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@98 -- # spdk_tgt_pid=72969 00:05:54.100 04:56:14 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@99 -- # waitforlisten 72969 /var/tmp/spdk.sock 00:05:54.100 04:56:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 72969 ']' 00:05:54.100 04:56:14 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@97 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks 00:05:54.100 04:56:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:54.100 04:56:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:54.100 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:54.100 04:56:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:54.100 04:56:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:54.100 04:56:14 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:05:54.362 [2024-12-15 04:56:14.278086] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:05:54.362 [2024-12-15 04:56:14.278205] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72969 ] 00:05:54.362 [2024-12-15 04:56:14.431844] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
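The target above was started with --disable-cpumask-locks, which is why it logs 'CPU core locks deactivated.' and leaves core 0 unclaimed; the second target started just below can then bind the same mask and take the lock itself. A sketch of that two-instance scenario, assuming an SPDK build tree, root privileges, and sleep in place of proper waitforlisten polling:

    build/bin/spdk_tgt -m 0x1 --disable-cpumask-locks &   # holds no core lock
    pid1=$!
    sleep 1                                               # crude startup wait
    build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock &    # same core, claims the lock
    pid2=$!
    sleep 1
    lslocks -p "$pid2" | grep spdk_cpu_lock               # only pid2 shows the flock
    kill "$pid1" "$pid2"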
00:05:54.362 [2024-12-15 04:56:14.431895] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:54.362 [2024-12-15 04:56:14.448997] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@102 -- # spdk_tgt_pid2=72985 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@101 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@103 -- # waitforlisten 72985 /var/tmp/spdk2.sock 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@835 -- # '[' -z 72985 ']' 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:55.354 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:55.354 04:56:15 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:05:55.354 [2024-12-15 04:56:15.186002] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:05:55.354 [2024-12-15 04:56:15.186595] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid72985 ] 00:05:55.354 [2024-12-15 04:56:15.350811] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:55.354 [2024-12-15 04:56:15.383811] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:55.926 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:55.926 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@868 -- # return 0 00:05:55.926 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@105 -- # locks_exist 72985 00:05:55.926 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 72985 00:05:55.926 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@107 -- # killprocess 72969 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 72969 ']' 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 72969 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 72969 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:56.498 killing process with pid 72969 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 72969' 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 72969 00:05:56.498 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 72969 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- event/cpu_locks.sh@108 -- # killprocess 72985 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@954 -- # '[' -z 72985 ']' 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@958 -- # kill -0 72985 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # uname 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 72985 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- 
common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:56.759 killing process with pid 72985 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 72985' 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@973 -- # kill 72985 00:05:56.759 04:56:16 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@978 -- # wait 72985 00:05:57.021 00:05:57.021 real 0m2.867s 00:05:57.021 user 0m3.208s 00:05:57.021 sys 0m0.753s 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:57.021 ************************************ 00:05:57.021 END TEST locking_app_on_unlocked_coremask 00:05:57.021 ************************************ 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_unlocked_coremask -- common/autotest_common.sh@10 -- # set +x 00:05:57.021 04:56:17 event.cpu_locks -- event/cpu_locks.sh@170 -- # run_test locking_app_on_locked_coremask locking_app_on_locked_coremask 00:05:57.021 04:56:17 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:57.021 04:56:17 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:57.021 04:56:17 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:05:57.021 ************************************ 00:05:57.021 START TEST locking_app_on_locked_coremask 00:05:57.021 ************************************ 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1129 -- # locking_app_on_locked_coremask 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@115 -- # spdk_tgt_pid=73043 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@114 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@116 -- # waitforlisten 73043 /var/tmp/spdk.sock 00:05:57.021 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 73043 ']' 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:57.021 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:05:57.282 [2024-12-15 04:56:17.224712] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:05:57.282 [2024-12-15 04:56:17.224831] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73043 ] 00:05:57.282 [2024-12-15 04:56:17.373578] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:57.282 [2024-12-15 04:56:17.391404] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:58.216 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:58.216 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 0 00:05:58.216 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@119 -- # spdk_tgt_pid2=73059 00:05:58.216 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@120 -- # NOT waitforlisten 73059 /var/tmp/spdk2.sock 00:05:58.216 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@652 -- # local es=0 00:05:58.216 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@118 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 -r /var/tmp/spdk2.sock 00:05:58.216 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 73059 /var/tmp/spdk2.sock 00:05:58.216 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:05:58.216 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:58.216 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:05:58.216 04:56:17 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:05:58.216 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # waitforlisten 73059 /var/tmp/spdk2.sock 00:05:58.216 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@835 -- # '[' -z 73059 ']' 00:05:58.216 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:05:58.216 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:58.217 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:05:58.217 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:05:58.217 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:58.217 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:05:58.217 [2024-12-15 04:56:18.072911] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:05:58.217 [2024-12-15 04:56:18.073030] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73059 ] 00:05:58.217 [2024-12-15 04:56:18.233641] app.c: 781:claim_cpu_cores: *ERROR*: Cannot create lock on core 0, probably process 73043 has claimed it. 00:05:58.217 [2024-12-15 04:56:18.233691] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:05:58.783 ERROR: process (pid: 73059) is no longer running 00:05:58.783 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 850: kill: (73059) - No such process 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@868 -- # return 1 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@655 -- # es=1 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@122 -- # locks_exist 73043 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # lslocks -p 73043 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@22 -- # grep -q spdk_cpu_lock 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- event/cpu_locks.sh@124 -- # killprocess 73043 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@954 -- # '[' -z 73043 ']' 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@958 -- # kill -0 73043 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # uname 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 73043 00:05:58.783 killing process with pid 73043 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 73043' 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@973 -- # kill 73043 00:05:58.783 04:56:18 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@978 -- # wait 73043 00:05:59.042 ************************************ 00:05:59.042 END TEST locking_app_on_locked_coremask 00:05:59.042 ************************************ 00:05:59.042 00:05:59.042 real 0m2.003s 00:05:59.042 user 0m2.217s 00:05:59.042 sys 0m0.494s 00:05:59.042 04:56:19 
event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:05:59.042 04:56:19 event.cpu_locks.locking_app_on_locked_coremask -- common/autotest_common.sh@10 -- # set +x 00:05:59.042 04:56:19 event.cpu_locks -- event/cpu_locks.sh@171 -- # run_test locking_overlapped_coremask locking_overlapped_coremask 00:05:59.042 04:56:19 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:05:59.042 04:56:19 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:05:59.042 04:56:19 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:05:59.042 ************************************ 00:05:59.042 START TEST locking_overlapped_coremask 00:05:59.042 ************************************ 00:05:59.042 04:56:19 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask 00:05:59.300 04:56:19 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@132 -- # spdk_tgt_pid=73101 00:05:59.300 04:56:19 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@133 -- # waitforlisten 73101 /var/tmp/spdk.sock 00:05:59.300 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:59.300 04:56:19 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 73101 ']' 00:05:59.300 04:56:19 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:59.300 04:56:19 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:05:59.300 04:56:19 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@131 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x7 00:05:59.300 04:56:19 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:59.300 04:56:19 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:05:59.300 04:56:19 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:05:59.300 [2024-12-15 04:56:19.255069] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
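This test moves to multi-core masks: -m 0x7 above runs reactors on cores 0-2, while the -m 0x1c instance started below targets cores 2-4, so the two masks collide on core 2 and the second launch is expected to abort. A small sketch of the bit-to-core mapping (illustrative helper, not part of the test suite):

    mask_to_cores() {             # e.g. mask_to_cores 0x1c  ->  2 3 4
        local mask=$(( $1 )) core=0 cores=()
        while (( mask )); do
            (( mask & 1 )) && cores+=("$core")   # bit N set -> core N gets a reactor
            (( core++, mask >>= 1 ))
        done
        echo "${cores[@]}"
    }

    # mask_to_cores 0x7 prints "0 1 2"; mask_to_cores 0x1c prints "2 3 4".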
00:05:59.300 [2024-12-15 04:56:19.255192] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73101 ] 00:05:59.300 [2024-12-15 04:56:19.409209] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:05:59.300 [2024-12-15 04:56:19.428002] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:05:59.300 [2024-12-15 04:56:19.428338] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:05:59.300 [2024-12-15 04:56:19.428405] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 0 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@135 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@136 -- # spdk_tgt_pid2=73119 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@137 -- # NOT waitforlisten 73119 /var/tmp/spdk2.sock 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@652 -- # local es=0 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@654 -- # valid_exec_arg waitforlisten 73119 /var/tmp/spdk2.sock 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@640 -- # local arg=waitforlisten 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # type -t waitforlisten 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # waitforlisten 73119 /var/tmp/spdk2.sock 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@835 -- # '[' -z 73119 ']' 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:00.236 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:00.236 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:00.236 [2024-12-15 04:56:20.156355] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
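The NOT wrapper traced here (local es=0 ... (( !es == 0 ))) inverts a command's exit status so that an expected failure counts as test success. A stripped-down sketch of that inversion (the real helper also special-cases signal-death exit codes above 128):

    NOT_sketch() {
        local es=0
        "$@" || es=$?
        (( es != 0 ))      # succeed only if the wrapped command failed
    }

    # e.g.: NOT_sketch waitforlisten "$pid2" /var/tmp/spdk2.sock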
00:06:00.236 [2024-12-15 04:56:20.156790] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73119 ] 00:06:00.236 [2024-12-15 04:56:20.328074] app.c: 781:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 73101 has claimed it. 00:06:00.236 [2024-12-15 04:56:20.328136] app.c: 912:spdk_app_start: *ERROR*: Unable to acquire lock on assigned core mask - exiting. 00:06:00.803 ERROR: process (pid: 73119) is no longer running 00:06:00.803 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 850: kill: (73119) - No such process 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@868 -- # return 1 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@655 -- # es=1 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@139 -- # check_remaining_locks 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- event/cpu_locks.sh@141 -- # killprocess 73101 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@954 -- # '[' -z 73101 ']' 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@958 -- # kill -0 73101 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # uname 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 73101 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:00.803 killing process with pid 73101 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@972 -- # echo 'killing process with pid 73101' 00:06:00.803 04:56:20 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@973 -- # kill 73101 00:06:00.803 04:56:20 
event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@978 -- # wait 73101 00:06:01.061 ************************************ 00:06:01.061 END TEST locking_overlapped_coremask 00:06:01.061 ************************************ 00:06:01.061 00:06:01.061 real 0m1.875s 00:06:01.061 user 0m5.221s 00:06:01.061 sys 0m0.377s 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask -- common/autotest_common.sh@10 -- # set +x 00:06:01.061 04:56:21 event.cpu_locks -- event/cpu_locks.sh@172 -- # run_test locking_overlapped_coremask_via_rpc locking_overlapped_coremask_via_rpc 00:06:01.061 04:56:21 event.cpu_locks -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:01.061 04:56:21 event.cpu_locks -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:01.061 04:56:21 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:01.061 ************************************ 00:06:01.061 START TEST locking_overlapped_coremask_via_rpc 00:06:01.061 ************************************ 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1129 -- # locking_overlapped_coremask_via_rpc 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@148 -- # spdk_tgt_pid=73161 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@149 -- # waitforlisten 73161 /var/tmp/spdk.sock 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 73161 ']' 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@147 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x7 --disable-cpumask-locks 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:01.061 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:01.061 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:01.062 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:01.062 [2024-12-15 04:56:21.173400] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:01.062 [2024-12-15 04:56:21.173541] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73161 ] 00:06:01.322 [2024-12-15 04:56:21.326473] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
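The "CPU core locks deactivated." notice above is the direct effect of the --disable-cpumask-locks flag: the target starts without claiming its /var/tmp/spdk_cpu_lock_* files, so the test can turn locking on later over JSON-RPC. In outline, with paths abbreviated (a minimal sketch assuming rpc.py from the SPDK repo is on PATH):

    # start a target that does not take its per-core locks at boot
    spdk_tgt -m 0x7 --disable-cpumask-locks &
    # ...then claim the locks on demand
    rpc.py framework_enable_cpumask_locks

The second target in this test is started the same way against /var/tmp/spdk2.sock, so the conflicting claim can be provoked deliberately through the RPC rather than at startup.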
00:06:01.322 [2024-12-15 04:56:21.326508] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:01.322 [2024-12-15 04:56:21.344969] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:06:01.322 [2024-12-15 04:56:21.345190] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.322 [2024-12-15 04:56:21.345274] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:06:01.893 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:01.894 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:06:01.894 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@151 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1c -r /var/tmp/spdk2.sock --disable-cpumask-locks 00:06:01.894 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@152 -- # spdk_tgt_pid2=73179 00:06:01.894 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@153 -- # waitforlisten 73179 /var/tmp/spdk2.sock 00:06:01.894 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:01.894 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 73179 ']' 00:06:01.894 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:01.894 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:01.894 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 00:06:01.894 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:01.894 04:56:21 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:02.157 [2024-12-15 04:56:22.071542] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:02.157 [2024-12-15 04:56:22.072110] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1c --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73179 ] 00:06:02.157 [2024-12-15 04:56:22.242241] app.c: 916:spdk_app_start: *NOTICE*: CPU core locks deactivated. 
00:06:02.157 [2024-12-15 04:56:22.242284] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:02.157 [2024-12-15 04:56:22.280610] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 3 00:06:02.157 [2024-12-15 04:56:22.280681] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:06:02.157 [2024-12-15 04:56:22.280732] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 4 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@155 -- # rpc_cmd framework_enable_cpumask_locks 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@156 -- # NOT rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@652 -- # local es=0 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # rpc_cmd -s /var/tmp/spdk2.sock framework_enable_cpumask_locks 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.109 [2024-12-15 04:56:22.947581] app.c: 781:claim_cpu_cores: *ERROR*: Cannot create lock on core 2, probably process 73161 has claimed it. 00:06:03.109 request: 00:06:03.109 { 00:06:03.109 "method": "framework_enable_cpumask_locks", 00:06:03.109 "req_id": 1 00:06:03.109 } 00:06:03.109 Got JSON-RPC error response 00:06:03.109 response: 00:06:03.109 { 00:06:03.109 "code": -32603, 00:06:03.109 "message": "Failed to claim CPU core: 2" 00:06:03.109 } 00:06:03.109 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
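The -32603 "Failed to claim CPU core: 2" response above is the intended outcome: pid 73161 already enabled its locks, so the same RPC on the second target's socket must fail, and the test wraps the call in the NOT helper so that the failure counts as a pass. Reduced to its essentials, the idiom visible in the trace looks like this (a condensed sketch, not a copy of autotest_common.sh):

    NOT() {
        # succeed only if the wrapped command fails
        local es=0
        "$@" || es=$?
        (( es != 0 ))
    }
    NOT rpc.py -s /var/tmp/spdk2.sock framework_enable_cpumask_locks

The valid_exec_arg/es bookkeeping in the trace adds guards for shell functions versus binaries, but the inverted exit status is the core of it.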
00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@655 -- # es=1 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@158 -- # waitforlisten 73161 /var/tmp/spdk.sock 00:06:03.109 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 73161 ']' 00:06:03.110 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:03.110 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:03.110 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:03.110 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:03.110 04:56:22 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.110 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock... 00:06:03.110 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:03.110 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:06:03.110 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@159 -- # waitforlisten 73179 /var/tmp/spdk2.sock 00:06:03.110 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@835 -- # '[' -z 73179 ']' 00:06:03.110 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk2.sock 00:06:03.110 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:03.110 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk2.sock...' 
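With both targets up, the check_remaining_locks step in the trace just below asserts that exactly the expected per-core lock files exist; the comparison is plain filename matching. A condensed sketch of the check as it appears in the xtrace of cpu_locks.sh:

    locks=(/var/tmp/spdk_cpu_lock_*)
    locks_expected=(/var/tmp/spdk_cpu_lock_{000..002})   # cores 0-2 for -m 0x7
    [[ "${locks[*]}" == "${locks_expected[*]}" ]]

Since pid 73179 was never granted core 2, only the three files belonging to pid 73161 should remain.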
00:06:03.110 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:03.110 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.371 ************************************ 00:06:03.371 END TEST locking_overlapped_coremask_via_rpc 00:06:03.371 ************************************ 00:06:03.371 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:03.371 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@868 -- # return 0 00:06:03.371 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@161 -- # check_remaining_locks 00:06:03.371 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@36 -- # locks=(/var/tmp/spdk_cpu_lock_*) 00:06:03.371 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@37 -- # locks_expected=(/var/tmp/spdk_cpu_lock_{000..002}) 00:06:03.371 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- event/cpu_locks.sh@38 -- # [[ /var/tmp/spdk_cpu_lock_000 /var/tmp/spdk_cpu_lock_001 /var/tmp/spdk_cpu_lock_002 == \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\0\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\1\ \/\v\a\r\/\t\m\p\/\s\p\d\k\_\c\p\u\_\l\o\c\k\_\0\0\2 ]] 00:06:03.371 00:06:03.371 real 0m2.284s 00:06:03.371 user 0m1.061s 00:06:03.371 sys 0m0.151s 00:06:03.371 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:03.371 04:56:23 event.cpu_locks.locking_overlapped_coremask_via_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:03.371 04:56:23 event.cpu_locks -- event/cpu_locks.sh@174 -- # cleanup 00:06:03.371 04:56:23 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 73161 ]] 00:06:03.371 04:56:23 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 73161 00:06:03.371 04:56:23 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 73161 ']' 00:06:03.371 04:56:23 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 73161 00:06:03.371 04:56:23 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:06:03.371 04:56:23 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:03.371 04:56:23 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 73161 00:06:03.371 killing process with pid 73161 00:06:03.371 04:56:23 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:03.371 04:56:23 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:03.371 04:56:23 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 73161' 00:06:03.371 04:56:23 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 73161 00:06:03.371 04:56:23 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 73161 00:06:03.630 04:56:23 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 73179 ]] 00:06:03.630 04:56:23 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 73179 00:06:03.630 04:56:23 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 73179 ']' 00:06:03.630 04:56:23 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 73179 00:06:03.630 04:56:23 event.cpu_locks -- common/autotest_common.sh@959 -- # uname 00:06:03.630 04:56:23 event.cpu_locks -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:03.630 
04:56:23 event.cpu_locks -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 73179 00:06:03.630 killing process with pid 73179 00:06:03.630 04:56:23 event.cpu_locks -- common/autotest_common.sh@960 -- # process_name=reactor_2 00:06:03.630 04:56:23 event.cpu_locks -- common/autotest_common.sh@964 -- # '[' reactor_2 = sudo ']' 00:06:03.630 04:56:23 event.cpu_locks -- common/autotest_common.sh@972 -- # echo 'killing process with pid 73179' 00:06:03.630 04:56:23 event.cpu_locks -- common/autotest_common.sh@973 -- # kill 73179 00:06:03.630 04:56:23 event.cpu_locks -- common/autotest_common.sh@978 -- # wait 73179 00:06:03.889 04:56:23 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:06:03.889 04:56:23 event.cpu_locks -- event/cpu_locks.sh@1 -- # cleanup 00:06:03.889 04:56:23 event.cpu_locks -- event/cpu_locks.sh@15 -- # [[ -z 73161 ]] 00:06:03.889 04:56:23 event.cpu_locks -- event/cpu_locks.sh@15 -- # killprocess 73161 00:06:03.889 04:56:23 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 73161 ']' 00:06:03.889 04:56:23 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 73161 00:06:03.889 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 958: kill: (73161) - No such process 00:06:03.889 Process with pid 73161 is not found 00:06:03.889 Process with pid 73179 is not found 00:06:03.889 04:56:23 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 73161 is not found' 00:06:03.889 04:56:23 event.cpu_locks -- event/cpu_locks.sh@16 -- # [[ -z 73179 ]] 00:06:03.889 04:56:23 event.cpu_locks -- event/cpu_locks.sh@16 -- # killprocess 73179 00:06:03.889 04:56:23 event.cpu_locks -- common/autotest_common.sh@954 -- # '[' -z 73179 ']' 00:06:03.889 04:56:23 event.cpu_locks -- common/autotest_common.sh@958 -- # kill -0 73179 00:06:03.889 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 958: kill: (73179) - No such process 00:06:03.889 04:56:23 event.cpu_locks -- common/autotest_common.sh@981 -- # echo 'Process with pid 73179 is not found' 00:06:03.889 04:56:23 event.cpu_locks -- event/cpu_locks.sh@18 -- # rm -f 00:06:03.889 00:06:03.889 real 0m15.600s 00:06:03.889 user 0m27.965s 00:06:03.889 sys 0m4.043s 00:06:03.889 04:56:23 event.cpu_locks -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:03.889 04:56:23 event.cpu_locks -- common/autotest_common.sh@10 -- # set +x 00:06:03.889 ************************************ 00:06:03.889 END TEST cpu_locks 00:06:03.889 ************************************ 00:06:03.889 00:06:03.889 real 0m40.733s 00:06:03.889 user 1m17.877s 00:06:03.889 sys 0m6.998s 00:06:03.889 04:56:23 event -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:03.889 04:56:23 event -- common/autotest_common.sh@10 -- # set +x 00:06:03.889 ************************************ 00:06:03.889 END TEST event 00:06:03.889 ************************************ 00:06:03.889 04:56:24 -- spdk/autotest.sh@169 -- # run_test thread /home/vagrant/spdk_repo/spdk/test/thread/thread.sh 00:06:03.889 04:56:24 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:03.889 04:56:24 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:03.889 04:56:24 -- common/autotest_common.sh@10 -- # set +x 00:06:03.889 ************************************ 00:06:03.889 START TEST thread 00:06:03.889 ************************************ 00:06:03.889 04:56:24 thread -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/thread/thread.sh 00:06:04.147 * Looking for test storage... 
00:06:04.147 * Found test storage at /home/vagrant/spdk_repo/spdk/test/thread 00:06:04.147 04:56:24 thread -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:06:04.147 04:56:24 thread -- common/autotest_common.sh@1711 -- # lcov --version 00:06:04.147 04:56:24 thread -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:06:04.147 04:56:24 thread -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:06:04.147 04:56:24 thread -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:04.147 04:56:24 thread -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:04.147 04:56:24 thread -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:04.147 04:56:24 thread -- scripts/common.sh@336 -- # IFS=.-: 00:06:04.147 04:56:24 thread -- scripts/common.sh@336 -- # read -ra ver1 00:06:04.147 04:56:24 thread -- scripts/common.sh@337 -- # IFS=.-: 00:06:04.147 04:56:24 thread -- scripts/common.sh@337 -- # read -ra ver2 00:06:04.147 04:56:24 thread -- scripts/common.sh@338 -- # local 'op=<' 00:06:04.147 04:56:24 thread -- scripts/common.sh@340 -- # ver1_l=2 00:06:04.147 04:56:24 thread -- scripts/common.sh@341 -- # ver2_l=1 00:06:04.147 04:56:24 thread -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:04.147 04:56:24 thread -- scripts/common.sh@344 -- # case "$op" in 00:06:04.147 04:56:24 thread -- scripts/common.sh@345 -- # : 1 00:06:04.147 04:56:24 thread -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:04.147 04:56:24 thread -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:04.147 04:56:24 thread -- scripts/common.sh@365 -- # decimal 1 00:06:04.147 04:56:24 thread -- scripts/common.sh@353 -- # local d=1 00:06:04.147 04:56:24 thread -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:04.147 04:56:24 thread -- scripts/common.sh@355 -- # echo 1 00:06:04.147 04:56:24 thread -- scripts/common.sh@365 -- # ver1[v]=1 00:06:04.147 04:56:24 thread -- scripts/common.sh@366 -- # decimal 2 00:06:04.148 04:56:24 thread -- scripts/common.sh@353 -- # local d=2 00:06:04.148 04:56:24 thread -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:04.148 04:56:24 thread -- scripts/common.sh@355 -- # echo 2 00:06:04.148 04:56:24 thread -- scripts/common.sh@366 -- # ver2[v]=2 00:06:04.148 04:56:24 thread -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:04.148 04:56:24 thread -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:04.148 04:56:24 thread -- scripts/common.sh@368 -- # return 0 00:06:04.148 04:56:24 thread -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:04.148 04:56:24 thread -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:06:04.148 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.148 --rc genhtml_branch_coverage=1 00:06:04.148 --rc genhtml_function_coverage=1 00:06:04.148 --rc genhtml_legend=1 00:06:04.148 --rc geninfo_all_blocks=1 00:06:04.148 --rc geninfo_unexecuted_blocks=1 00:06:04.148 00:06:04.148 ' 00:06:04.148 04:56:24 thread -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:06:04.148 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.148 --rc genhtml_branch_coverage=1 00:06:04.148 --rc genhtml_function_coverage=1 00:06:04.148 --rc genhtml_legend=1 00:06:04.148 --rc geninfo_all_blocks=1 00:06:04.148 --rc geninfo_unexecuted_blocks=1 00:06:04.148 00:06:04.148 ' 00:06:04.148 04:56:24 thread -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:06:04.148 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 
00:06:04.148 --rc genhtml_branch_coverage=1 00:06:04.148 --rc genhtml_function_coverage=1 00:06:04.148 --rc genhtml_legend=1 00:06:04.148 --rc geninfo_all_blocks=1 00:06:04.148 --rc geninfo_unexecuted_blocks=1 00:06:04.148 00:06:04.148 ' 00:06:04.148 04:56:24 thread -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:06:04.148 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:04.148 --rc genhtml_branch_coverage=1 00:06:04.148 --rc genhtml_function_coverage=1 00:06:04.148 --rc genhtml_legend=1 00:06:04.148 --rc geninfo_all_blocks=1 00:06:04.148 --rc geninfo_unexecuted_blocks=1 00:06:04.148 00:06:04.148 ' 00:06:04.148 04:56:24 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /home/vagrant/spdk_repo/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:04.148 04:56:24 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:06:04.148 04:56:24 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:04.148 04:56:24 thread -- common/autotest_common.sh@10 -- # set +x 00:06:04.148 ************************************ 00:06:04.148 START TEST thread_poller_perf 00:06:04.148 ************************************ 00:06:04.148 04:56:24 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:04.148 [2024-12-15 04:56:24.188858] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:04.148 [2024-12-15 04:56:24.189059] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73306 ] 00:06:04.406 [2024-12-15 04:56:24.340554] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:04.406 [2024-12-15 04:56:24.358817] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:04.406 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:05.340 [2024-12-15T04:56:25.480Z] ====================================== 00:06:05.340 [2024-12-15T04:56:25.480Z] busy:2607192600 (cyc) 00:06:05.340 [2024-12-15T04:56:25.480Z] total_run_count: 413000 00:06:05.340 [2024-12-15T04:56:25.480Z] tsc_hz: 2600000000 (cyc) 00:06:05.340 [2024-12-15T04:56:25.480Z] ====================================== 00:06:05.340 [2024-12-15T04:56:25.480Z] poller_cost: 6312 (cyc), 2427 (nsec) 00:06:05.340 00:06:05.340 real 0m1.239s 00:06:05.340 user 0m1.078s 00:06:05.340 sys 0m0.055s 00:06:05.340 04:56:25 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:05.340 ************************************ 00:06:05.340 04:56:25 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:05.340 END TEST thread_poller_perf 00:06:05.340 ************************************ 00:06:05.340 04:56:25 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /home/vagrant/spdk_repo/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:05.340 04:56:25 thread -- common/autotest_common.sh@1105 -- # '[' 8 -le 1 ']' 00:06:05.340 04:56:25 thread -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:05.340 04:56:25 thread -- common/autotest_common.sh@10 -- # set +x 00:06:05.340 ************************************ 00:06:05.340 START TEST thread_poller_perf 00:06:05.340 ************************************ 00:06:05.340 04:56:25 thread.thread_poller_perf -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:05.598 [2024-12-15 04:56:25.496681] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:05.598 [2024-12-15 04:56:25.496938] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73337 ] 00:06:05.598 [2024-12-15 04:56:25.651579] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:05.598 [2024-12-15 04:56:25.667899] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.598 Running 1000 pollers for 1 seconds with 0 microseconds period. 
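The poller_cost line in the summary above is derived, not measured separately: busy cycles divided by run count gives cycles per poll, and the TSC rate converts that to nanoseconds. A worked recomputation from the printed counters of the 1-microsecond run (shell arithmetic only, no extra instrumentation):

    busy=2607192600; runs=413000; tsc_hz=2600000000
    cyc=$(( busy / runs ))                    # 6312 cycles per poll
    nsec=$(( cyc * 1000000000 / tsc_hz ))     # 2427 ns at 2.6 GHz
    echo "poller_cost: $cyc (cyc), $nsec (nsec)"

The zero-period run whose results follow shows a much lower per-call cost, since timed pollers carry extra scheduling work on every execution.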
00:06:06.974 [2024-12-15T04:56:27.114Z] ====================================== 00:06:06.974 [2024-12-15T04:56:27.114Z] busy:2602465810 (cyc) 00:06:06.974 [2024-12-15T04:56:27.115Z] total_run_count: 4902000 00:06:06.975 [2024-12-15T04:56:27.115Z] tsc_hz: 2600000000 (cyc) 00:06:06.975 [2024-12-15T04:56:27.115Z] ====================================== 00:06:06.975 [2024-12-15T04:56:27.115Z] poller_cost: 530 (cyc), 203 (nsec) 00:06:06.975 00:06:06.975 real 0m1.240s 00:06:06.975 user 0m1.072s 00:06:06.975 sys 0m0.063s 00:06:06.975 04:56:26 thread.thread_poller_perf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:06.975 04:56:26 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:06.975 ************************************ 00:06:06.975 END TEST thread_poller_perf 00:06:06.975 ************************************ 00:06:06.975 04:56:26 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:06:06.975 ************************************ 00:06:06.975 END TEST thread 00:06:06.975 ************************************ 00:06:06.975 00:06:06.975 real 0m2.725s 00:06:06.975 user 0m2.260s 00:06:06.975 sys 0m0.234s 00:06:06.975 04:56:26 thread -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:06.975 04:56:26 thread -- common/autotest_common.sh@10 -- # set +x 00:06:06.975 04:56:26 -- spdk/autotest.sh@171 -- # [[ 0 -eq 1 ]] 00:06:06.975 04:56:26 -- spdk/autotest.sh@176 -- # run_test app_cmdline /home/vagrant/spdk_repo/spdk/test/app/cmdline.sh 00:06:06.975 04:56:26 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:06.975 04:56:26 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:06.975 04:56:26 -- common/autotest_common.sh@10 -- # set +x 00:06:06.975 ************************************ 00:06:06.975 START TEST app_cmdline 00:06:06.975 ************************************ 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/app/cmdline.sh 00:06:06.975 * Looking for test storage... 
00:06:06.975 * Found test storage at /home/vagrant/spdk_repo/spdk/test/app 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@1711 -- # lcov --version 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@336 -- # IFS=.-: 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@336 -- # read -ra ver1 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@337 -- # IFS=.-: 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@337 -- # read -ra ver2 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@338 -- # local 'op=<' 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@340 -- # ver1_l=2 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@341 -- # ver2_l=1 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@344 -- # case "$op" in 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@345 -- # : 1 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@365 -- # decimal 1 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@353 -- # local d=1 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@355 -- # echo 1 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@365 -- # ver1[v]=1 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@366 -- # decimal 2 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@353 -- # local d=2 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@355 -- # echo 2 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@366 -- # ver2[v]=2 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:06.975 04:56:26 app_cmdline -- scripts/common.sh@368 -- # return 0 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:06:06.975 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.975 --rc genhtml_branch_coverage=1 00:06:06.975 --rc genhtml_function_coverage=1 00:06:06.975 --rc genhtml_legend=1 00:06:06.975 --rc geninfo_all_blocks=1 00:06:06.975 --rc geninfo_unexecuted_blocks=1 00:06:06.975 00:06:06.975 ' 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:06:06.975 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.975 --rc genhtml_branch_coverage=1 00:06:06.975 --rc genhtml_function_coverage=1 00:06:06.975 --rc genhtml_legend=1 00:06:06.975 --rc geninfo_all_blocks=1 00:06:06.975 --rc geninfo_unexecuted_blocks=1 00:06:06.975 
00:06:06.975 ' 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:06:06.975 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.975 --rc genhtml_branch_coverage=1 00:06:06.975 --rc genhtml_function_coverage=1 00:06:06.975 --rc genhtml_legend=1 00:06:06.975 --rc geninfo_all_blocks=1 00:06:06.975 --rc geninfo_unexecuted_blocks=1 00:06:06.975 00:06:06.975 ' 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:06:06.975 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:06.975 --rc genhtml_branch_coverage=1 00:06:06.975 --rc genhtml_function_coverage=1 00:06:06.975 --rc genhtml_legend=1 00:06:06.975 --rc geninfo_all_blocks=1 00:06:06.975 --rc geninfo_unexecuted_blocks=1 00:06:06.975 00:06:06.975 ' 00:06:06.975 04:56:26 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:06:06.975 04:56:26 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=73426 00:06:06.975 04:56:26 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 73426 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@835 -- # '[' -z 73426 ']' 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:06.975 04:56:26 app_cmdline -- app/cmdline.sh@16 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:06.975 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:06.975 04:56:26 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:06:06.975 [2024-12-15 04:56:27.008404] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:06:06.975 [2024-12-15 04:56:27.008775] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73426 ] 00:06:07.234 [2024-12-15 04:56:27.162240] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:07.234 [2024-12-15 04:56:27.179179] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:07.800 04:56:27 app_cmdline -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:07.800 04:56:27 app_cmdline -- common/autotest_common.sh@868 -- # return 0 00:06:07.800 04:56:27 app_cmdline -- app/cmdline.sh@20 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py spdk_get_version 00:06:08.059 { 00:06:08.059 "version": "SPDK v25.01-pre git sha1 e01cb43b8", 00:06:08.059 "fields": { 00:06:08.059 "major": 25, 00:06:08.059 "minor": 1, 00:06:08.059 "patch": 0, 00:06:08.059 "suffix": "-pre", 00:06:08.059 "commit": "e01cb43b8" 00:06:08.059 } 00:06:08.059 } 00:06:08.059 04:56:28 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:06:08.059 04:56:28 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:06:08.059 04:56:28 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:06:08.059 04:56:28 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:06:08.059 04:56:28 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:06:08.059 04:56:28 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:08.059 04:56:28 app_cmdline -- app/cmdline.sh@26 -- # sort 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:08.059 04:56:28 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 2 )) 00:06:08.059 04:56:28 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:06:08.059 04:56:28 app_cmdline -- app/cmdline.sh@30 -- # NOT /home/vagrant/spdk_repo/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@652 -- # local es=0 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@654 -- # valid_exec_arg /home/vagrant/spdk_repo/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@640 -- # local arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@644 -- # type -t /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@646 -- # type -P /home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@646 -- # arg=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@646 -- # [[ -x /home/vagrant/spdk_repo/spdk/scripts/rpc.py ]] 00:06:08.059 04:56:28 app_cmdline -- common/autotest_common.sh@655 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:06:08.318 request: 00:06:08.318 { 00:06:08.318 "method": "env_dpdk_get_mem_stats", 00:06:08.318 "req_id": 1 00:06:08.318 } 00:06:08.318 Got JSON-RPC error response 00:06:08.318 response: 00:06:08.318 { 00:06:08.318 "code": -32601, 00:06:08.318 "message": "Method not found" 00:06:08.318 } 00:06:08.318 04:56:28 app_cmdline -- common/autotest_common.sh@655 -- # es=1 00:06:08.318 04:56:28 app_cmdline -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:06:08.318 04:56:28 app_cmdline -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:06:08.318 04:56:28 app_cmdline -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:06:08.318 04:56:28 app_cmdline -- app/cmdline.sh@1 -- # killprocess 73426 00:06:08.318 04:56:28 app_cmdline -- common/autotest_common.sh@954 -- # '[' -z 73426 ']' 00:06:08.318 04:56:28 app_cmdline -- common/autotest_common.sh@958 -- # kill -0 73426 00:06:08.318 04:56:28 app_cmdline -- common/autotest_common.sh@959 -- # uname 00:06:08.318 04:56:28 app_cmdline -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:08.318 04:56:28 app_cmdline -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 73426 00:06:08.318 killing process with pid 73426 00:06:08.319 04:56:28 app_cmdline -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:08.319 04:56:28 app_cmdline -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:08.319 04:56:28 app_cmdline -- common/autotest_common.sh@972 -- # echo 'killing process with pid 73426' 00:06:08.319 04:56:28 app_cmdline -- common/autotest_common.sh@973 -- # kill 73426 00:06:08.319 04:56:28 app_cmdline -- common/autotest_common.sh@978 -- # wait 73426 00:06:08.577 ************************************ 00:06:08.577 END TEST app_cmdline 00:06:08.577 ************************************ 00:06:08.577 00:06:08.577 real 0m1.699s 00:06:08.577 user 0m2.015s 00:06:08.577 sys 0m0.383s 00:06:08.577 04:56:28 app_cmdline -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:08.577 04:56:28 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:06:08.577 04:56:28 -- spdk/autotest.sh@177 -- # run_test version /home/vagrant/spdk_repo/spdk/test/app/version.sh 00:06:08.577 04:56:28 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:06:08.577 04:56:28 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:08.577 04:56:28 -- common/autotest_common.sh@10 -- # set +x 00:06:08.577 ************************************ 00:06:08.577 START TEST version 00:06:08.577 ************************************ 00:06:08.577 04:56:28 version -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/app/version.sh 00:06:08.577 * Looking for test storage... 
00:06:08.577 * Found test storage at /home/vagrant/spdk_repo/spdk/test/app 00:06:08.577 04:56:28 version -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:06:08.577 04:56:28 version -- common/autotest_common.sh@1711 -- # lcov --version 00:06:08.577 04:56:28 version -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:06:08.577 04:56:28 version -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:06:08.577 04:56:28 version -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:08.577 04:56:28 version -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:08.577 04:56:28 version -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:08.577 04:56:28 version -- scripts/common.sh@336 -- # IFS=.-: 00:06:08.577 04:56:28 version -- scripts/common.sh@336 -- # read -ra ver1 00:06:08.578 04:56:28 version -- scripts/common.sh@337 -- # IFS=.-: 00:06:08.578 04:56:28 version -- scripts/common.sh@337 -- # read -ra ver2 00:06:08.578 04:56:28 version -- scripts/common.sh@338 -- # local 'op=<' 00:06:08.578 04:56:28 version -- scripts/common.sh@340 -- # ver1_l=2 00:06:08.578 04:56:28 version -- scripts/common.sh@341 -- # ver2_l=1 00:06:08.578 04:56:28 version -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:08.578 04:56:28 version -- scripts/common.sh@344 -- # case "$op" in 00:06:08.578 04:56:28 version -- scripts/common.sh@345 -- # : 1 00:06:08.578 04:56:28 version -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:08.578 04:56:28 version -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:08.578 04:56:28 version -- scripts/common.sh@365 -- # decimal 1 00:06:08.578 04:56:28 version -- scripts/common.sh@353 -- # local d=1 00:06:08.578 04:56:28 version -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:08.578 04:56:28 version -- scripts/common.sh@355 -- # echo 1 00:06:08.578 04:56:28 version -- scripts/common.sh@365 -- # ver1[v]=1 00:06:08.578 04:56:28 version -- scripts/common.sh@366 -- # decimal 2 00:06:08.578 04:56:28 version -- scripts/common.sh@353 -- # local d=2 00:06:08.578 04:56:28 version -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:08.578 04:56:28 version -- scripts/common.sh@355 -- # echo 2 00:06:08.578 04:56:28 version -- scripts/common.sh@366 -- # ver2[v]=2 00:06:08.578 04:56:28 version -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:08.578 04:56:28 version -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:08.578 04:56:28 version -- scripts/common.sh@368 -- # return 0 00:06:08.578 04:56:28 version -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:08.578 04:56:28 version -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:06:08.578 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.578 --rc genhtml_branch_coverage=1 00:06:08.578 --rc genhtml_function_coverage=1 00:06:08.578 --rc genhtml_legend=1 00:06:08.578 --rc geninfo_all_blocks=1 00:06:08.578 --rc geninfo_unexecuted_blocks=1 00:06:08.578 00:06:08.578 ' 00:06:08.578 04:56:28 version -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:06:08.578 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.578 --rc genhtml_branch_coverage=1 00:06:08.578 --rc genhtml_function_coverage=1 00:06:08.578 --rc genhtml_legend=1 00:06:08.578 --rc geninfo_all_blocks=1 00:06:08.578 --rc geninfo_unexecuted_blocks=1 00:06:08.578 00:06:08.578 ' 00:06:08.578 04:56:28 version -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:06:08.578 --rc lcov_branch_coverage=1 --rc 
lcov_function_coverage=1 00:06:08.578 --rc genhtml_branch_coverage=1 00:06:08.578 --rc genhtml_function_coverage=1 00:06:08.578 --rc genhtml_legend=1 00:06:08.578 --rc geninfo_all_blocks=1 00:06:08.578 --rc geninfo_unexecuted_blocks=1 00:06:08.578 00:06:08.578 ' 00:06:08.578 04:56:28 version -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:06:08.578 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.578 --rc genhtml_branch_coverage=1 00:06:08.578 --rc genhtml_function_coverage=1 00:06:08.578 --rc genhtml_legend=1 00:06:08.578 --rc geninfo_all_blocks=1 00:06:08.578 --rc geninfo_unexecuted_blocks=1 00:06:08.578 00:06:08.578 ' 00:06:08.578 04:56:28 version -- app/version.sh@17 -- # get_header_version major 00:06:08.578 04:56:28 version -- app/version.sh@14 -- # cut -f2 00:06:08.578 04:56:28 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /home/vagrant/spdk_repo/spdk/include/spdk/version.h 00:06:08.578 04:56:28 version -- app/version.sh@14 -- # tr -d '"' 00:06:08.578 04:56:28 version -- app/version.sh@17 -- # major=25 00:06:08.578 04:56:28 version -- app/version.sh@18 -- # get_header_version minor 00:06:08.578 04:56:28 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /home/vagrant/spdk_repo/spdk/include/spdk/version.h 00:06:08.578 04:56:28 version -- app/version.sh@14 -- # cut -f2 00:06:08.578 04:56:28 version -- app/version.sh@14 -- # tr -d '"' 00:06:08.578 04:56:28 version -- app/version.sh@18 -- # minor=1 00:06:08.578 04:56:28 version -- app/version.sh@19 -- # get_header_version patch 00:06:08.578 04:56:28 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /home/vagrant/spdk_repo/spdk/include/spdk/version.h 00:06:08.578 04:56:28 version -- app/version.sh@14 -- # tr -d '"' 00:06:08.578 04:56:28 version -- app/version.sh@14 -- # cut -f2 00:06:08.836 04:56:28 version -- app/version.sh@19 -- # patch=0 00:06:08.836 04:56:28 version -- app/version.sh@20 -- # get_header_version suffix 00:06:08.836 04:56:28 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /home/vagrant/spdk_repo/spdk/include/spdk/version.h 00:06:08.836 04:56:28 version -- app/version.sh@14 -- # cut -f2 00:06:08.836 04:56:28 version -- app/version.sh@14 -- # tr -d '"' 00:06:08.836 04:56:28 version -- app/version.sh@20 -- # suffix=-pre 00:06:08.836 04:56:28 version -- app/version.sh@22 -- # version=25.1 00:06:08.836 04:56:28 version -- app/version.sh@25 -- # (( patch != 0 )) 00:06:08.836 04:56:28 version -- app/version.sh@28 -- # version=25.1rc0 00:06:08.836 04:56:28 version -- app/version.sh@30 -- # PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python 00:06:08.836 04:56:28 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:06:08.836 04:56:28 version -- app/version.sh@30 -- # py_version=25.1rc0 00:06:08.836 04:56:28 version -- app/version.sh@31 -- # [[ 25.1rc0 == \2\5\.\1\r\c\0 ]] 00:06:08.836 00:06:08.836 real 0m0.203s 00:06:08.836 user 0m0.124s 00:06:08.836 sys 0m0.104s 00:06:08.836 ************************************ 00:06:08.836 END TEST version 00:06:08.836 ************************************ 00:06:08.836 04:56:28 version -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:08.836 04:56:28 version -- common/autotest_common.sh@10 -- # set +x 00:06:08.836 04:56:28 -- 
spdk/autotest.sh@179 -- # '[' 0 -eq 1 ']' 00:06:08.836 04:56:28 -- spdk/autotest.sh@188 -- # [[ 0 -eq 1 ]] 00:06:08.836 04:56:28 -- spdk/autotest.sh@194 -- # uname -s 00:06:08.836 04:56:28 -- spdk/autotest.sh@194 -- # [[ Linux == Linux ]] 00:06:08.836 04:56:28 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:06:08.836 04:56:28 -- spdk/autotest.sh@195 -- # [[ 0 -eq 1 ]] 00:06:08.836 04:56:28 -- spdk/autotest.sh@207 -- # '[' 1 -eq 1 ']' 00:06:08.836 04:56:28 -- spdk/autotest.sh@208 -- # run_test blockdev_nvme /home/vagrant/spdk_repo/spdk/test/bdev/blockdev.sh nvme 00:06:08.836 04:56:28 -- common/autotest_common.sh@1105 -- # '[' 3 -le 1 ']' 00:06:08.836 04:56:28 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:08.836 04:56:28 -- common/autotest_common.sh@10 -- # set +x 00:06:08.836 ************************************ 00:06:08.836 START TEST blockdev_nvme 00:06:08.836 ************************************ 00:06:08.836 04:56:28 blockdev_nvme -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/bdev/blockdev.sh nvme 00:06:08.836 * Looking for test storage... 00:06:08.836 * Found test storage at /home/vagrant/spdk_repo/spdk/test/bdev 00:06:08.836 04:56:28 blockdev_nvme -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:06:08.836 04:56:28 blockdev_nvme -- common/autotest_common.sh@1711 -- # lcov --version 00:06:08.836 04:56:28 blockdev_nvme -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:06:08.836 04:56:28 blockdev_nvme -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:06:08.836 04:56:28 blockdev_nvme -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:08.836 04:56:28 blockdev_nvme -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:08.836 04:56:28 blockdev_nvme -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:08.836 04:56:28 blockdev_nvme -- scripts/common.sh@336 -- # IFS=.-: 00:06:08.836 04:56:28 blockdev_nvme -- scripts/common.sh@336 -- # read -ra ver1 00:06:08.836 04:56:28 blockdev_nvme -- scripts/common.sh@337 -- # IFS=.-: 00:06:08.836 04:56:28 blockdev_nvme -- scripts/common.sh@337 -- # read -ra ver2 00:06:08.836 04:56:28 blockdev_nvme -- scripts/common.sh@338 -- # local 'op=<' 00:06:08.836 04:56:28 blockdev_nvme -- scripts/common.sh@340 -- # ver1_l=2 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@341 -- # ver2_l=1 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@344 -- # case "$op" in 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@345 -- # : 1 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@365 -- # decimal 1 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@353 -- # local d=1 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@355 -- # echo 1 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@365 -- # ver1[v]=1 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@366 -- # decimal 2 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@353 -- # local d=2 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@355 -- # echo 2 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@366 -- # ver2[v]=2 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:08.837 04:56:28 blockdev_nvme -- scripts/common.sh@368 -- # return 0 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:06:08.837 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.837 --rc genhtml_branch_coverage=1 00:06:08.837 --rc genhtml_function_coverage=1 00:06:08.837 --rc genhtml_legend=1 00:06:08.837 --rc geninfo_all_blocks=1 00:06:08.837 --rc geninfo_unexecuted_blocks=1 00:06:08.837 00:06:08.837 ' 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:06:08.837 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.837 --rc genhtml_branch_coverage=1 00:06:08.837 --rc genhtml_function_coverage=1 00:06:08.837 --rc genhtml_legend=1 00:06:08.837 --rc geninfo_all_blocks=1 00:06:08.837 --rc geninfo_unexecuted_blocks=1 00:06:08.837 00:06:08.837 ' 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:06:08.837 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.837 --rc genhtml_branch_coverage=1 00:06:08.837 --rc genhtml_function_coverage=1 00:06:08.837 --rc genhtml_legend=1 00:06:08.837 --rc geninfo_all_blocks=1 00:06:08.837 --rc geninfo_unexecuted_blocks=1 00:06:08.837 00:06:08.837 ' 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:06:08.837 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:08.837 --rc genhtml_branch_coverage=1 00:06:08.837 --rc genhtml_function_coverage=1 00:06:08.837 --rc genhtml_legend=1 00:06:08.837 --rc geninfo_all_blocks=1 00:06:08.837 --rc geninfo_unexecuted_blocks=1 00:06:08.837 00:06:08.837 ' 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@10 -- # source /home/vagrant/spdk_repo/spdk/test/bdev/nbd_common.sh 00:06:08.837 04:56:28 blockdev_nvme -- bdev/nbd_common.sh@6 -- # set -e 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@13 -- # conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@17 -- # export 
RPC_PIPE_TIMEOUT=30 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@20 -- # : 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@707 -- # QOS_DEV_1=Malloc_0 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@708 -- # QOS_DEV_2=Null_1 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@709 -- # QOS_RUN_TIME=5 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@711 -- # uname -s 00:06:08.837 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@711 -- # '[' Linux = Linux ']' 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@713 -- # PRE_RESERVED_MEM=0 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@719 -- # test_type=nvme 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@720 -- # crypto_device= 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@721 -- # dek= 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@722 -- # env_ctx= 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@723 -- # wait_for_rpc= 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@724 -- # '[' -n '' ']' 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@727 -- # [[ nvme == bdev ]] 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@727 -- # [[ nvme == crypto_* ]] 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@730 -- # start_spdk_tgt 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=73587 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@49 -- # waitforlisten 73587 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@835 -- # '[' -z 73587 ']' 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:08.837 04:56:28 blockdev_nvme -- bdev/blockdev.sh@46 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt '' '' 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:08.837 04:56:28 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:09.096 [2024-12-15 04:56:29.009424] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
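(Reader's aid: the waitforlisten step above boils down to polling the freshly started spdk_tgt until its UNIX-domain RPC socket answers. A minimal bash sketch of that loop, using only paths that appear in this log; the retry count and use of rpc.py here are assumptions for illustration, not the exact helper source.)
  # Poll the RPC socket until spdk_tgt is ready (sketch; retry count assumed).
  RPC_SOCK=/var/tmp/spdk.sock
  RPC_PY=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
  for _ in $(seq 1 100); do
      # rpc_get_methods succeeds once the target is listening and serving RPCs.
      if "$RPC_PY" -s "$RPC_SOCK" rpc_get_methods &> /dev/null; then
          break
      fi
      sleep 0.1
  done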
00:06:09.096 [2024-12-15 04:56:29.009706] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73587 ] 00:06:09.096 [2024-12-15 04:56:29.165174] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:09.096 [2024-12-15 04:56:29.181978] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:09.662 04:56:29 blockdev_nvme -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:09.662 04:56:29 blockdev_nvme -- common/autotest_common.sh@868 -- # return 0 00:06:09.662 04:56:29 blockdev_nvme -- bdev/blockdev.sh@731 -- # case "$test_type" in 00:06:09.662 04:56:29 blockdev_nvme -- bdev/blockdev.sh@736 -- # setup_nvme_conf 00:06:09.662 04:56:29 blockdev_nvme -- bdev/blockdev.sh@81 -- # local json 00:06:09.662 04:56:29 blockdev_nvme -- bdev/blockdev.sh@82 -- # mapfile -t json 00:06:09.662 04:56:29 blockdev_nvme -- bdev/blockdev.sh@82 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:06:09.920 04:56:29 blockdev_nvme -- bdev/blockdev.sh@83 -- # rpc_cmd load_subsystem_config -j ''\''{ "subsystem": "bdev", "config": [ { "method": "bdev_nvme_attach_controller", "params": { "trtype": "PCIe", "name":"Nvme0", "traddr":"0000:00:10.0" } },{ "method": "bdev_nvme_attach_controller", "params": { "trtype": "PCIe", "name":"Nvme1", "traddr":"0000:00:11.0" } },{ "method": "bdev_nvme_attach_controller", "params": { "trtype": "PCIe", "name":"Nvme2", "traddr":"0000:00:12.0" } },{ "method": "bdev_nvme_attach_controller", "params": { "trtype": "PCIe", "name":"Nvme3", "traddr":"0000:00:13.0" } } ] }'\''' 00:06:09.920 04:56:29 blockdev_nvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:09.920 04:56:29 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@774 -- # rpc_cmd bdev_wait_for_examine 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@777 -- # cat 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@777 -- # rpc_cmd save_subsystem_config -n accel 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@777 -- # rpc_cmd save_subsystem_config -n bdev 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@777 -- # rpc_cmd save_subsystem_config -n iobuf 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:10.178 04:56:30 blockdev_nvme -- 
bdev/blockdev.sh@785 -- # mapfile -t bdevs 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@785 -- # jq -r '.[] | select(.claimed == false)' 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@785 -- # rpc_cmd bdev_get_bdevs 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@786 -- # mapfile -t bdevs_name 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@786 -- # jq -r .name 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@786 -- # printf '%s\n' '{' ' "name": "Nvme0n1",' ' "aliases": [' ' "48ee66d2-fd9c-41ad-be2c-dbdbc72711b8"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 1548666,' ' "uuid": "48ee66d2-fd9c-41ad-be2c-dbdbc72711b8",' ' "numa_id": -1,' ' "md_size": 64,' ' "md_interleave": false,' ' "dif_type": 0,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": true,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:10.0",' ' "trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:10.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12340",' ' "firmware_revision": "8.0.0",' ' "subnqn": "nqn.2019-08.org.qemu:12340",' ' "oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": false,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 1,' ' "can_share": false' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' '{' ' "name": "Nvme1n1",' ' "aliases": [' ' "318c89a7-cf1f-47b4-80d8-b27f42af8ad6"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 1310720,' ' "uuid": "318c89a7-cf1f-47b4-80d8-b27f42af8ad6",' ' "numa_id": -1,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:11.0",' ' "trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:11.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12341",' ' "firmware_revision": "8.0.0",' ' "subnqn": 
"nqn.2019-08.org.qemu:12341",' ' "oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": false,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 1,' ' "can_share": false' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' '{' ' "name": "Nvme2n1",' ' "aliases": [' ' "e5584d8a-2b7a-4342-959b-50a668ea6f92"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "e5584d8a-2b7a-4342-959b-50a668ea6f92",' ' "numa_id": -1,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:12.0",' ' "trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:12.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12342",' ' "firmware_revision": "8.0.0",' ' "subnqn": "nqn.2019-08.org.qemu:12342",' ' "oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": false,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 1,' ' "can_share": false' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' '{' ' "name": "Nvme2n2",' ' "aliases": [' ' "6296e32b-0b3c-4ea1-bcae-a34018e159b3"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "6296e32b-0b3c-4ea1-bcae-a34018e159b3",' ' "numa_id": -1,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:12.0",' ' "trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:12.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12342",' ' "firmware_revision": "8.0.0",' ' "subnqn": "nqn.2019-08.org.qemu:12342",' ' "oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": false,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 2,' ' "can_share": false' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' '{' ' "name": "Nvme2n3",' ' "aliases": [' ' "d01f71f6-ef4d-4b3f-b763-0bf11fd8825c"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 
1048576,' ' "uuid": "d01f71f6-ef4d-4b3f-b763-0bf11fd8825c",' ' "numa_id": -1,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:12.0",' ' "trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:12.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12342",' ' "firmware_revision": "8.0.0",' ' "subnqn": "nqn.2019-08.org.qemu:12342",' ' "oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": false,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 3,' ' "can_share": false' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' '{' ' "name": "Nvme3n1",' ' "aliases": [' ' "69dac07f-e75e-4ab2-8c0d-81d8fe4ad330"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 262144,' ' "uuid": "69dac07f-e75e-4ab2-8c0d-81d8fe4ad330",' ' "numa_id": -1,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:13.0",' ' "trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:13.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12343",' ' "firmware_revision": "8.0.0",' ' "subnqn": "nqn.2019-08.org.qemu:fdp-subsys3",' ' "oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": true,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 1,' ' "can_share": true' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@787 -- # bdev_list=("${bdevs_name[@]}") 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@789 -- # hello_world_bdev=Nvme0n1 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@790 -- # trap - SIGINT SIGTERM EXIT 00:06:10.178 04:56:30 blockdev_nvme -- bdev/blockdev.sh@791 -- # killprocess 73587 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@954 -- # '[' -z 73587 ']' 00:06:10.178 04:56:30 blockdev_nvme -- common/autotest_common.sh@958 -- # kill -0 73587 00:06:10.179 04:56:30 blockdev_nvme -- common/autotest_common.sh@959 -- # uname 00:06:10.179 04:56:30 
blockdev_nvme -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:10.179 04:56:30 blockdev_nvme -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 73587 00:06:10.179 killing process with pid 73587 00:06:10.179 04:56:30 blockdev_nvme -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:10.179 04:56:30 blockdev_nvme -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:10.179 04:56:30 blockdev_nvme -- common/autotest_common.sh@972 -- # echo 'killing process with pid 73587' 00:06:10.179 04:56:30 blockdev_nvme -- common/autotest_common.sh@973 -- # kill 73587 00:06:10.179 04:56:30 blockdev_nvme -- common/autotest_common.sh@978 -- # wait 73587 00:06:10.436 04:56:30 blockdev_nvme -- bdev/blockdev.sh@795 -- # trap cleanup SIGINT SIGTERM EXIT 00:06:10.436 04:56:30 blockdev_nvme -- bdev/blockdev.sh@797 -- # run_test bdev_hello_world /home/vagrant/spdk_repo/spdk/build/examples/hello_bdev --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -b Nvme0n1 '' 00:06:10.436 04:56:30 blockdev_nvme -- common/autotest_common.sh@1105 -- # '[' 7 -le 1 ']' 00:06:10.436 04:56:30 blockdev_nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:10.436 04:56:30 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:10.436 ************************************ 00:06:10.436 START TEST bdev_hello_world 00:06:10.436 ************************************ 00:06:10.436 04:56:30 blockdev_nvme.bdev_hello_world -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/hello_bdev --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -b Nvme0n1 '' 00:06:10.694 [2024-12-15 04:56:30.582912] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:10.694 [2024-12-15 04:56:30.583142] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73649 ] 00:06:10.694 [2024-12-15 04:56:30.754131] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.694 [2024-12-15 04:56:30.777937] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.260 [2024-12-15 04:56:31.151885] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:06:11.260 [2024-12-15 04:56:31.152067] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Nvme0n1 00:06:11.260 [2024-12-15 04:56:31.152088] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:06:11.260 [2024-12-15 04:56:31.153721] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:06:11.260 [2024-12-15 04:56:31.154048] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:06:11.260 [2024-12-15 04:56:31.154069] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:06:11.260 [2024-12-15 04:56:31.154255] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
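(Reader's aid: the long bdev dump earlier in this run comes from bdev_get_bdevs filtered through jq, first selecting unclaimed bdevs and then extracting names. Outside the harness, which talks over an RPC pipe via rpc_cmd, the same list can be reproduced with rpc.py against the socket; combining the two jq passes into one filter is an assumption of this sketch.)
  # Sketch: reproduce the bdev name list from the dump above (socket path assumed).
  RPC_PY=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
  "$RPC_PY" -s /var/tmp/spdk.sock bdev_get_bdevs \
      | jq -r '.[] | select(.claimed == false) | .name'
  # Expected here: Nvme0n1 Nvme1n1 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1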
00:06:11.260 00:06:11.260 [2024-12-15 04:56:31.154271] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:06:11.260 00:06:11.260 real 0m0.765s 00:06:11.260 user 0m0.512s 00:06:11.260 sys 0m0.151s 00:06:11.260 ************************************ 00:06:11.260 END TEST bdev_hello_world 00:06:11.260 ************************************ 00:06:11.260 04:56:31 blockdev_nvme.bdev_hello_world -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:11.260 04:56:31 blockdev_nvme.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:06:11.260 04:56:31 blockdev_nvme -- bdev/blockdev.sh@798 -- # run_test bdev_bounds bdev_bounds '' 00:06:11.260 04:56:31 blockdev_nvme -- common/autotest_common.sh@1105 -- # '[' 3 -le 1 ']' 00:06:11.260 04:56:31 blockdev_nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:11.260 04:56:31 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:11.260 ************************************ 00:06:11.260 START TEST bdev_bounds 00:06:11.260 ************************************ 00:06:11.260 Process bdevio pid: 73680 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@1129 -- # bdev_bounds '' 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=73680 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 73680' 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 73680 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@835 -- # '[' -z 73680 ']' 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- bdev/blockdev.sh@288 -- # /home/vagrant/spdk_repo/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json '' 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:11.260 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:11.260 04:56:31 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:06:11.517 [2024-12-15 04:56:31.398645] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:06:11.517 [2024-12-15 04:56:31.398877] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid73680 ] 00:06:11.517 [2024-12-15 04:56:31.546771] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:11.517 [2024-12-15 04:56:31.565742] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:06:11.517 [2024-12-15 04:56:31.566005] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.517 [2024-12-15 04:56:31.566005] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:06:12.449 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:12.449 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@868 -- # return 0 00:06:12.449 04:56:32 blockdev_nvme.bdev_bounds -- bdev/blockdev.sh@293 -- # /home/vagrant/spdk_repo/spdk/test/bdev/bdevio/tests.py perform_tests 00:06:12.449 I/O targets: 00:06:12.449 Nvme0n1: 1548666 blocks of 4096 bytes (6050 MiB) 00:06:12.449 Nvme1n1: 1310720 blocks of 4096 bytes (5120 MiB) 00:06:12.449 Nvme2n1: 1048576 blocks of 4096 bytes (4096 MiB) 00:06:12.449 Nvme2n2: 1048576 blocks of 4096 bytes (4096 MiB) 00:06:12.449 Nvme2n3: 1048576 blocks of 4096 bytes (4096 MiB) 00:06:12.449 Nvme3n1: 262144 blocks of 4096 bytes (1024 MiB) 00:06:12.449 00:06:12.449 00:06:12.449 CUnit - A unit testing framework for C - Version 2.1-3 00:06:12.449 http://cunit.sourceforge.net/ 00:06:12.449 00:06:12.449 00:06:12.449 Suite: bdevio tests on: Nvme3n1 00:06:12.449 Test: blockdev write read block ...passed 00:06:12.449 Test: blockdev write zeroes read block ...passed 00:06:12.449 Test: blockdev write zeroes read no split ...passed 00:06:12.449 Test: blockdev write zeroes read split ...passed 00:06:12.449 Test: blockdev write zeroes read split partial ...passed 00:06:12.449 Test: blockdev reset ...[2024-12-15 04:56:32.353783] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:13.0, 0] resetting controller 00:06:12.449 passed 00:06:12.449 Test: blockdev write read 8 blocks ...[2024-12-15 04:56:32.355552] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:13.0, 0] Resetting controller successful. 
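(Reader's aid: bdevio was started with -w so it blocks on the RPC socket until tests.py perform_tests triggers the suites listed above. Reproducing that two-step flow by hand, with the commands and paths taken from this log, would look roughly like this sketch.)
  # Sketch: start bdevio in wait mode, then drive it over RPC.
  SPDK=/home/vagrant/spdk_repo/spdk
  "$SPDK/test/bdev/bdevio/bdevio" -w -s 0 --json "$SPDK/test/bdev/bdev.json" &
  bdevio_pid=$!
  # ...wait for /var/tmp/spdk.sock as in the earlier sketch, then:
  "$SPDK/test/bdev/bdevio/tests.py" perform_tests
  wait "$bdevio_pid"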
00:06:12.449 passed 00:06:12.449 Test: blockdev write read size > 128k ...passed 00:06:12.449 Test: blockdev write read invalid size ...passed 00:06:12.449 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:12.449 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:12.449 Test: blockdev write read max offset ...passed 00:06:12.449 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:12.449 Test: blockdev writev readv 8 blocks ...passed 00:06:12.449 Test: blockdev writev readv 30 x 1block ...passed 00:06:12.449 Test: blockdev writev readv block ...passed 00:06:12.449 Test: blockdev writev readv size > 128k ...passed 00:06:12.449 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:12.449 Test: blockdev comparev and writev ...[2024-12-15 04:56:32.359449] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:1 lba:0 len:1 SGL DATA BLOCK ADDRESS 0x2d020e000 len:0x1000 00:06:12.449 [2024-12-15 04:56:32.359496] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:12.449 passed 00:06:12.449 Test: blockdev nvme passthru rw ...passed 00:06:12.449 Test: blockdev nvme passthru vendor specific ...passed 00:06:12.449 Test: blockdev nvme admin passthru ...[2024-12-15 04:56:32.359977] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:190 PRP1 0x0 PRP2 0x0 00:06:12.450 [2024-12-15 04:56:32.360009] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:190 cdw0:0 sqhd:001c p:1 m:0 dnr:1 00:06:12.450 passed 00:06:12.450 Test: blockdev copy ...passed 00:06:12.450 Suite: bdevio tests on: Nvme2n3 00:06:12.450 Test: blockdev write read block ...passed 00:06:12.450 Test: blockdev write zeroes read block ...passed 00:06:12.450 Test: blockdev write zeroes read no split ...passed 00:06:12.450 Test: blockdev write zeroes read split ...passed 00:06:12.450 Test: blockdev write zeroes read split partial ...passed 00:06:12.450 Test: blockdev reset ...[2024-12-15 04:56:32.374799] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:12.0, 0] resetting controller 00:06:12.450 [2024-12-15 04:56:32.376620] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:12.0, 0] Resetting controller successful. passed 00:06:12.450 Test: blockdev write read 8 blocks ...passed 00:06:12.450 Test: blockdev write read size > 128k ...
00:06:12.450 passed 00:06:12.450 Test: blockdev write read invalid size ...passed 00:06:12.450 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:12.450 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:12.450 Test: blockdev write read max offset ...passed 00:06:12.450 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:12.450 Test: blockdev writev readv 8 blocks ...passed 00:06:12.450 Test: blockdev writev readv 30 x 1block ...passed 00:06:12.450 Test: blockdev writev readv block ...passed 00:06:12.450 Test: blockdev writev readv size > 128k ...passed 00:06:12.450 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:12.450 Test: blockdev comparev and writev ...[2024-12-15 04:56:32.380472] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:3 lba:0 len:1 SGL DATA BLOCK ADDRESS 0x2d0206000 len:0x1000 00:06:12.450 [2024-12-15 04:56:32.380510] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:12.450 passed 00:06:12.450 Test: blockdev nvme passthru rw ...passed 00:06:12.450 Test: blockdev nvme passthru vendor specific ...passed 00:06:12.450 Test: blockdev nvme admin passthru ...[2024-12-15 04:56:32.380965] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC RESERVED / VENDOR SPECIFIC qid:1 cid:190 PRP1 0x0 PRP2 0x0 00:06:12.450 [2024-12-15 04:56:32.380993] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:190 cdw0:0 sqhd:001c p:1 m:0 dnr:1 00:06:12.450 passed 00:06:12.450 Test: blockdev copy ...passed 00:06:12.450 Suite: bdevio tests on: Nvme2n2 00:06:12.450 Test: blockdev write read block ...passed 00:06:12.450 Test: blockdev write zeroes read block ...passed 00:06:12.450 Test: blockdev write zeroes read no split ...passed 00:06:12.450 Test: blockdev write zeroes read split ...passed 00:06:12.450 Test: blockdev write zeroes read split partial ...passed 00:06:12.450 Test: blockdev reset ...[2024-12-15 04:56:32.394719] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:12.0, 0] resetting controller 00:06:12.450 [2024-12-15 04:56:32.396418] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:12.0, 0] Resetting controller successful. passed 00:06:12.450 Test: blockdev write read 8 blocks ...passed 00:06:12.450 Test: blockdev write read size > 128k ...
00:06:12.450 passed 00:06:12.450 Test: blockdev write read invalid size ...passed 00:06:12.450 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:12.450 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:12.450 Test: blockdev write read max offset ...passed 00:06:12.450 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:12.450 Test: blockdev writev readv 8 blocks ...passed 00:06:12.450 Test: blockdev writev readv 30 x 1block ...passed 00:06:12.450 Test: blockdev writev readv block ...passed 00:06:12.450 Test: blockdev writev readv size > 128k ...passed 00:06:12.450 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:12.450 Test: blockdev comparev and writev ...[2024-12-15 04:56:32.400427] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:2 lba:0 len:1 SGL DATA BLOCK ADDRESS 0x2d0208000 len:0x1000 00:06:12.450 [2024-12-15 04:56:32.400475] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:12.450 passed 00:06:12.450 Test: blockdev nvme passthru rw ...passed 00:06:12.450 Test: blockdev nvme passthru vendor specific ...passed 00:06:12.450 Test: blockdev nvme admin passthru ...[2024-12-15 04:56:32.400919] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC RESERVED / VENDOR SPECIFIC qid:1 cid:190 PRP1 0x0 PRP2 0x0 00:06:12.450 [2024-12-15 04:56:32.400941] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:190 cdw0:0 sqhd:001c p:1 m:0 dnr:1 00:06:12.450 passed 00:06:12.450 Test: blockdev copy ...passed 00:06:12.450 Suite: bdevio tests on: Nvme2n1 00:06:12.450 Test: blockdev write read block ...passed 00:06:12.450 Test: blockdev write zeroes read block ...passed 00:06:12.450 Test: blockdev write zeroes read no split ...passed 00:06:12.450 Test: blockdev write zeroes read split ...passed 00:06:12.450 Test: blockdev write zeroes read split partial ...passed 00:06:12.450 Test: blockdev reset ...[2024-12-15 04:56:32.414204] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:12.0, 0] resetting controller 00:06:12.450 [2024-12-15 04:56:32.415823] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:12.0, 0] Resetting controller successful. passed 00:06:12.450 Test: blockdev write read 8 blocks ...passed 00:06:12.450 Test: blockdev write read size > 128k ...
00:06:12.450 passed 00:06:12.450 Test: blockdev write read invalid size ...passed 00:06:12.450 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:12.450 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:12.450 Test: blockdev write read max offset ...passed 00:06:12.450 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:12.450 Test: blockdev writev readv 8 blocks ...passed 00:06:12.450 Test: blockdev writev readv 30 x 1block ...passed 00:06:12.450 Test: blockdev writev readv block ...passed 00:06:12.450 Test: blockdev writev readv size > 128k ...passed 00:06:12.450 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:12.450 Test: blockdev comparev and writev ...[2024-12-15 04:56:32.420684] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:1 lba:0 len:1 SGL DATA BLOCK ADDRESS 0x2cfe04000 len:0x1000 00:06:12.450 [2024-12-15 04:56:32.420718] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:12.450 passed 00:06:12.450 Test: blockdev nvme passthru rw ...passed 00:06:12.450 Test: blockdev nvme passthru vendor specific ...passed 00:06:12.450 Test: blockdev nvme admin passthru ...[2024-12-15 04:56:32.421096] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:190 PRP1 0x0 PRP2 0x0 00:06:12.450 [2024-12-15 04:56:32.421113] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:190 cdw0:0 sqhd:001c p:1 m:0 dnr:1 00:06:12.450 passed 00:06:12.450 Test: blockdev copy ...passed 00:06:12.450 Suite: bdevio tests on: Nvme1n1 00:06:12.450 Test: blockdev write read block ...passed 00:06:12.450 Test: blockdev write zeroes read block ...passed 00:06:12.450 Test: blockdev write zeroes read no split ...passed 00:06:12.450 Test: blockdev write zeroes read split ...passed 00:06:12.450 Test: blockdev write zeroes read split partial ...passed 00:06:12.450 Test: blockdev reset ...[2024-12-15 04:56:32.434628] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:11.0, 0] resetting controller 00:06:12.450 passed 00:06:12.450 Test: blockdev write read 8 blocks ...[2024-12-15 04:56:32.435959] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:11.0, 0] Resetting controller successful. 
00:06:12.450 passed 00:06:12.450 Test: blockdev write read size > 128k ...passed 00:06:12.450 Test: blockdev write read invalid size ...passed 00:06:12.450 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:12.450 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:12.450 Test: blockdev write read max offset ...passed 00:06:12.450 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:12.450 Test: blockdev writev readv 8 blocks ...passed 00:06:12.450 Test: blockdev writev readv 30 x 1block ...passed 00:06:12.450 Test: blockdev writev readv block ...passed 00:06:12.450 Test: blockdev writev readv size > 128k ...passed 00:06:12.450 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:12.450 Test: blockdev comparev and writev ...[2024-12-15 04:56:32.439383] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:1 lba:0 len:1 SGL DATA BLOCK ADDRESS 0x2e723d000 len:0x1000 00:06:12.450 [2024-12-15 04:56:32.439414] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:12.450 passed 00:06:12.450 Test: blockdev nvme passthru rw ...passed 00:06:12.450 Test: blockdev nvme passthru vendor specific ...passed 00:06:12.450 Test: blockdev nvme admin passthru ...[2024-12-15 04:56:32.439913] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:190 PRP1 0x0 PRP2 0x0 00:06:12.450 [2024-12-15 04:56:32.439939] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:190 cdw0:0 sqhd:001c p:1 m:0 dnr:1 00:06:12.450 passed 00:06:12.450 Test: blockdev copy ...passed 00:06:12.450 Suite: bdevio tests on: Nvme0n1 00:06:12.450 Test: blockdev write read block ...passed 00:06:12.450 Test: blockdev write zeroes read block ...passed 00:06:12.450 Test: blockdev write zeroes read no split ...passed 00:06:12.450 Test: blockdev write zeroes read split ...passed 00:06:12.450 Test: blockdev write zeroes read split partial ...passed 00:06:12.450 Test: blockdev reset ...[2024-12-15 04:56:32.455487] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:10.0, 0] resetting controller 00:06:12.450 passed 00:06:12.450 Test: blockdev write read 8 blocks ...[2024-12-15 04:56:32.456868] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:10.0, 0] Resetting controller successful. 00:06:12.450 passed 00:06:12.450 Test: blockdev write read size > 128k ...passed 00:06:12.450 Test: blockdev write read invalid size ...passed 00:06:12.450 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:12.450 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:12.450 Test: blockdev write read max offset ...passed 00:06:12.450 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:12.450 Test: blockdev writev readv 8 blocks ...passed 00:06:12.450 Test: blockdev writev readv 30 x 1block ...passed 00:06:12.450 Test: blockdev writev readv block ...passed 00:06:12.450 Test: blockdev writev readv size > 128k ...passed 00:06:12.450 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:12.450 Test: blockdev comparev and writev ...[2024-12-15 04:56:32.460035] bdevio.c: 727:blockdev_comparev_and_writev: *ERROR*: skipping comparev_and_writev on bdev Nvme0n1 since it has 00:06:12.450 separate metadata which is not supported yet. 
00:06:12.450 passed 00:06:12.450 Test: blockdev nvme passthru rw ...passed 00:06:12.450 Test: blockdev nvme passthru vendor specific ...passed 00:06:12.450 Test: blockdev nvme admin passthru ...[2024-12-15 04:56:32.460389] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:191 PRP1 0x0 PRP2 0x0 00:06:12.451 [2024-12-15 04:56:32.460418] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:191 cdw0:0 sqhd:0017 p:1 m:0 dnr:1 00:06:12.451 passed 00:06:12.451 Test: blockdev copy ...passed 00:06:12.451 00:06:12.451 Run Summary: Type Total Ran Passed Failed Inactive 00:06:12.451 suites 6 6 n/a 0 0 00:06:12.451 tests 138 138 138 0 0 00:06:12.451 asserts 893 893 893 0 n/a 00:06:12.451 00:06:12.451 Elapsed time = 0.282 seconds 00:06:12.451 0 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 73680 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@954 -- # '[' -z 73680 ']' 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@958 -- # kill -0 73680 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@959 -- # uname 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 73680 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:12.451 killing process with pid 73680 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@972 -- # echo 'killing process with pid 73680' 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@973 -- # kill 73680 00:06:12.451 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@978 -- # wait 73680 00:06:12.708 04:56:32 blockdev_nvme.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:06:12.708 00:06:12.708 real 0m1.288s 00:06:12.708 user 0m3.374s 00:06:12.708 sys 0m0.252s 00:06:12.708 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:12.708 04:56:32 blockdev_nvme.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:06:12.708 ************************************ 00:06:12.708 END TEST bdev_bounds 00:06:12.708 ************************************ 00:06:12.708 04:56:32 blockdev_nvme -- bdev/blockdev.sh@799 -- # run_test bdev_nbd nbd_function_test /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 'Nvme0n1 Nvme1n1 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' '' 00:06:12.708 04:56:32 blockdev_nvme -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:06:12.708 04:56:32 blockdev_nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:12.708 04:56:32 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:12.708 ************************************ 00:06:12.708 START TEST bdev_nbd 00:06:12.708 ************************************ 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@1129 -- # nbd_function_test /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 'Nvme0n1 Nvme1n1 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' '' 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:06:12.708 04:56:32 
blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Nvme0n1' 'Nvme1n1' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=6 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=6 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Nvme0n1' 'Nvme1n1' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=73723 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 73723 /var/tmp/spdk-nbd.sock 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@316 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json '' 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@835 -- # '[' -z 73723 ']' 00:06:12.708 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:12.708 04:56:32 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:06:12.708 [2024-12-15 04:56:32.766662] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
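(Reader's aid: the nbd_function_test that follows exports each bdev as a kernel NBD device over the dedicated /var/tmp/spdk-nbd.sock and then proves the node works with the grep + one-block O_DIRECT dd pattern visible below. A condensed sketch for a single device, with the bdev and device names taken from this log and the scratch output path assumed:)
  # Sketch: export one bdev via NBD and verify it responds to reads.
  RPC_PY=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
  NBD_SOCK=/var/tmp/spdk-nbd.sock
  "$RPC_PY" -s "$NBD_SOCK" nbd_start_disk Nvme0n1 /dev/nbd0
  grep -q -w nbd0 /proc/partitions                      # kernel sees the device?
  dd if=/dev/nbd0 of=/tmp/nbdtest bs=4096 count=1 iflag=direct
  "$RPC_PY" -s "$NBD_SOCK" nbd_stop_disk /dev/nbd0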
00:06:12.708 [2024-12-15 04:56:32.766769] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:06:12.966 [2024-12-15 04:56:32.925231] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:12.966 [2024-12-15 04:56:32.944655] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@868 -- # return 0 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Nvme0n1 Nvme1n1 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Nvme0n1' 'Nvme1n1' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Nvme0n1 Nvme1n1 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Nvme0n1' 'Nvme1n1' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:06:13.530 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme0n1 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:13.787 1+0 records in 
00:06:13.787 1+0 records out 00:06:13.787 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000767634 s, 5.3 MB/s 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:06:13.787 04:56:33 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme1n1 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:14.045 1+0 records in 00:06:14.045 1+0 records out 00:06:14.045 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00101238 s, 4.0 MB/s 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:06:14.045 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n1 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # 
waitfornbd nbd2 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd2 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd2 /proc/partitions 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd2 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:14.303 1+0 records in 00:06:14.303 1+0 records out 00:06:14.303 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00139285 s, 2.9 MB/s 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:06:14.303 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n2 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd3 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd3 /proc/partitions 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd3 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:14.560 1+0 records in 00:06:14.560 1+0 records out 00:06:14.560 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00101556 s, 4.0 MB/s 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:14.560 04:56:34 
blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:06:14.560 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n3 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd4 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd4 /proc/partitions 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd4 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:14.817 1+0 records in 00:06:14.817 1+0 records out 00:06:14.817 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000807703 s, 5.1 MB/s 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:06:14.817 04:56:34 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme3n1 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd5 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- 
common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd5 /proc/partitions 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd5 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:15.074 1+0 records in 00:06:15.074 1+0 records out 00:06:15.074 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00102661 s, 4.0 MB/s 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:06:15.074 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@118 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd0", 00:06:15.333 "bdev_name": "Nvme0n1" 00:06:15.333 }, 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd1", 00:06:15.333 "bdev_name": "Nvme1n1" 00:06:15.333 }, 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd2", 00:06:15.333 "bdev_name": "Nvme2n1" 00:06:15.333 }, 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd3", 00:06:15.333 "bdev_name": "Nvme2n2" 00:06:15.333 }, 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd4", 00:06:15.333 "bdev_name": "Nvme2n3" 00:06:15.333 }, 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd5", 00:06:15.333 "bdev_name": "Nvme3n1" 00:06:15.333 } 00:06:15.333 ]' 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd0", 00:06:15.333 "bdev_name": "Nvme0n1" 00:06:15.333 }, 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd1", 00:06:15.333 "bdev_name": "Nvme1n1" 00:06:15.333 }, 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd2", 00:06:15.333 "bdev_name": "Nvme2n1" 00:06:15.333 }, 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd3", 00:06:15.333 "bdev_name": "Nvme2n2" 00:06:15.333 }, 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd4", 00:06:15.333 "bdev_name": "Nvme2n3" 00:06:15.333 }, 00:06:15.333 { 00:06:15.333 "nbd_device": "/dev/nbd5", 00:06:15.333 "bdev_name": "Nvme3n1" 00:06:15.333 } 00:06:15.333 ]' 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- 
bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5' 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5') 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:15.333 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:15.593 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:06:15.852 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:06:15.852 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:06:15.852 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:06:15.852 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:15.852 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:15.852 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:06:15.852 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:15.852 04:56:35 blockdev_nvme.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:06:15.852 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:15.852 04:56:35 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:06:16.109 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:06:16.109 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:06:16.109 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:06:16.109 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:16.109 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:16.109 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:06:16.109 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:16.109 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:16.109 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:16.109 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:06:16.368 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:06:16.368 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:06:16.368 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:06:16.368 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:16.368 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:16.368 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:06:16.368 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:16.368 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:16.368 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:16.368 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:06:16.628 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:06:16.628 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:06:16.628 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:06:16.628 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:16.628 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:16.628 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:06:16.629 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:16.629 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:16.629 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:16.629 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:16.629 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:16.889 04:56:36 
blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Nvme0n1 Nvme1n1 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13' 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Nvme0n1' 'Nvme1n1' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Nvme0n1 Nvme1n1 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13' 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Nvme0n1' 'Nvme1n1' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:06:16.889 04:56:36 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme0n1 /dev/nbd0 00:06:17.150 /dev/nbd0 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:17.150 
04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:17.150 1+0 records in 00:06:17.150 1+0 records out 00:06:17.150 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000842998 s, 4.9 MB/s 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:06:17.150 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme1n1 /dev/nbd1 00:06:17.411 /dev/nbd1 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:17.411 1+0 records in 00:06:17.411 1+0 records out 00:06:17.411 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00098998 s, 4.1 MB/s 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 
-- # return 0 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:06:17.411 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n1 /dev/nbd10 00:06:17.671 /dev/nbd10 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd10 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd10 /proc/partitions 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd10 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:17.671 1+0 records in 00:06:17.671 1+0 records out 00:06:17.671 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000835466 s, 4.9 MB/s 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:06:17.671 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n2 /dev/nbd11 00:06:17.932 /dev/nbd11 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd11 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd11 /proc/partitions 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- 
common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd11 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:17.932 1+0 records in 00:06:17.932 1+0 records out 00:06:17.932 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00140946 s, 2.9 MB/s 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:06:17.932 04:56:37 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n3 /dev/nbd12 00:06:18.193 /dev/nbd12 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd12 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd12 /proc/partitions 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd12 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:18.193 1+0 records in 00:06:18.193 1+0 records out 00:06:18.193 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00109408 s, 3.7 MB/s 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:06:18.193 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme3n1 /dev/nbd13 00:06:18.454 /dev/nbd13 00:06:18.454 04:56:38 
blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:06:18.454 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:06:18.454 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd13 00:06:18.454 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:18.454 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:18.454 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:18.454 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd13 /proc/partitions 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd13 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:18.455 1+0 records in 00:06:18.455 1+0 records out 00:06:18.455 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000559827 s, 7.3 MB/s 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:18.455 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:18.716 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:18.716 { 00:06:18.716 "nbd_device": "/dev/nbd0", 00:06:18.716 "bdev_name": "Nvme0n1" 00:06:18.716 }, 00:06:18.716 { 00:06:18.716 "nbd_device": "/dev/nbd1", 00:06:18.716 "bdev_name": "Nvme1n1" 00:06:18.716 }, 00:06:18.716 { 00:06:18.716 "nbd_device": "/dev/nbd10", 00:06:18.716 "bdev_name": "Nvme2n1" 00:06:18.716 }, 00:06:18.716 { 00:06:18.716 "nbd_device": "/dev/nbd11", 00:06:18.716 "bdev_name": "Nvme2n2" 00:06:18.717 }, 00:06:18.717 { 00:06:18.717 "nbd_device": "/dev/nbd12", 00:06:18.717 "bdev_name": "Nvme2n3" 00:06:18.717 }, 00:06:18.717 { 00:06:18.717 "nbd_device": "/dev/nbd13", 00:06:18.717 "bdev_name": "Nvme3n1" 00:06:18.717 } 00:06:18.717 ]' 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:18.717 { 00:06:18.717 "nbd_device": "/dev/nbd0", 00:06:18.717 "bdev_name": "Nvme0n1" 00:06:18.717 }, 00:06:18.717 { 00:06:18.717 "nbd_device": "/dev/nbd1", 00:06:18.717 "bdev_name": "Nvme1n1" 00:06:18.717 }, 00:06:18.717 { 00:06:18.717 "nbd_device": "/dev/nbd10", 00:06:18.717 "bdev_name": "Nvme2n1" 00:06:18.717 }, 00:06:18.717 
{ 00:06:18.717 "nbd_device": "/dev/nbd11", 00:06:18.717 "bdev_name": "Nvme2n2" 00:06:18.717 }, 00:06:18.717 { 00:06:18.717 "nbd_device": "/dev/nbd12", 00:06:18.717 "bdev_name": "Nvme2n3" 00:06:18.717 }, 00:06:18.717 { 00:06:18.717 "nbd_device": "/dev/nbd13", 00:06:18.717 "bdev_name": "Nvme3n1" 00:06:18.717 } 00:06:18.717 ]' 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:18.717 /dev/nbd1 00:06:18.717 /dev/nbd10 00:06:18.717 /dev/nbd11 00:06:18.717 /dev/nbd12 00:06:18.717 /dev/nbd13' 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:18.717 /dev/nbd1 00:06:18.717 /dev/nbd10 00:06:18.717 /dev/nbd11 00:06:18.717 /dev/nbd12 00:06:18.717 /dev/nbd13' 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=6 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 6 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=6 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 6 -ne 6 ']' 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13' write 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:06:18.717 256+0 records in 00:06:18.717 256+0 records out 00:06:18.717 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00575564 s, 182 MB/s 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:18.717 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:18.977 256+0 records in 00:06:18.977 256+0 records out 00:06:18.977 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.206165 s, 5.1 MB/s 00:06:18.977 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:18.977 04:56:38 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:19.236 256+0 records in 00:06:19.236 256+0 records out 00:06:19.236 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.246264 s, 4.3 MB/s 00:06:19.236 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:19.236 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:06:19.236 256+0 records in 00:06:19.236 256+0 records out 00:06:19.236 1048576 bytes (1.0 MB, 1.0 MiB) 
copied, 0.234488 s, 4.5 MB/s 00:06:19.236 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:19.237 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:06:19.495 256+0 records in 00:06:19.495 256+0 records out 00:06:19.495 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.188593 s, 5.6 MB/s 00:06:19.495 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:19.495 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:06:19.753 256+0 records in 00:06:19.753 256+0 records out 00:06:19.753 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.154724 s, 6.8 MB/s 00:06:19.753 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:19.753 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:06:20.012 256+0 records in 00:06:20.012 256+0 records out 00:06:20.012 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.193242 s, 5.4 MB/s 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13' verify 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd0 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd1 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd10 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd11 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd12 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # 
cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd13 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13' 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:20.012 04:56:39 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:20.272 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:20.272 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:20.272 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:20.272 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:20.272 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:20.272 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:20.272 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:20.272 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:20.272 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:20.272 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:20.531 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:20.532 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:20.791 04:56:40 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:06:21.051 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:06:21.051 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:06:21.051 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:06:21.051 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:21.051 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:21.051 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:06:21.051 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:21.051 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:21.051 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:21.051 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@61 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.311 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock /dev/nbd0 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd=/dev/nbd0 00:06:21.571 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@134 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:06:21.831 malloc_lvol_verify 00:06:21.831 04:56:41 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@135 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:06:22.091 68dc5ab7-f138-4a65-9027-c28354477aa6 00:06:22.091 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@136 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:06:22.349 61d1c1d9-c97c-4868-baf3-c45c2a08350e 00:06:22.349 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@137 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:06:22.349 /dev/nbd0 00:06:22.349 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@139 -- # wait_for_nbd_set_capacity /dev/nbd0 00:06:22.349 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@146 -- # local nbd=nbd0 00:06:22.349 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@148 -- # [[ -e /sys/block/nbd0/size ]] 00:06:22.349 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@150 -- # (( 8192 == 0 )) 00:06:22.349 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs.ext4 /dev/nbd0 00:06:22.349 mke2fs 1.47.0 (5-Feb-2023) 00:06:22.349 Discarding device blocks: 0/4096 done 00:06:22.349 Creating filesystem with 4096 1k blocks and 1024 inodes 00:06:22.349 00:06:22.349 Allocating group tables: 0/1 done 00:06:22.349 Writing inode tables: 0/1 done 00:06:22.349 Creating journal (1024 blocks): done 00:06:22.350 Writing superblocks and filesystem accounting information: 0/1 done 00:06:22.350 00:06:22.350 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:06:22.608 04:56:42 
blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 73723 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@954 -- # '[' -z 73723 ']' 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@958 -- # kill -0 73723 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@959 -- # uname 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 73723 00:06:22.608 killing process with pid 73723 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@972 -- # echo 'killing process with pid 73723' 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@973 -- # kill 73723 00:06:22.608 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@978 -- # wait 73723 00:06:22.867 ************************************ 00:06:22.867 END TEST bdev_nbd 00:06:22.867 ************************************ 00:06:22.867 04:56:42 blockdev_nvme.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:06:22.867 00:06:22.867 real 0m10.188s 00:06:22.867 user 0m14.193s 00:06:22.867 sys 0m3.600s 00:06:22.867 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:22.867 04:56:42 blockdev_nvme.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:06:22.867 04:56:42 blockdev_nvme -- bdev/blockdev.sh@800 -- # [[ y == y ]] 00:06:22.867 04:56:42 blockdev_nvme -- bdev/blockdev.sh@801 -- # '[' nvme = nvme ']' 00:06:22.867 skipping fio tests on NVMe due to multi-ns failures. 00:06:22.867 04:56:42 blockdev_nvme -- bdev/blockdev.sh@803 -- # echo 'skipping fio tests on NVMe due to multi-ns failures.' 
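[Editor's note] The bdev_nbd test above leans on two helpers from common/autotest_common.sh: waitfornbd, which polls /proc/partitions until the freshly attached /dev/nbdX appears and then proves it is readable with a single O_DIRECT 4 KiB read, and waitfornbd_exit, which polls the same file until the name disappears again after nbd_stop_disk. A minimal bash sketch of waitfornbd as reconstructed from the xtrace; the retry delay, scratch-file path, and failure handling are assumptions — only the checks visible in the log are certain:

    waitfornbd() {
        local nbd_name=$1 i size
        # Poll (up to 20 tries) until the nbd device is listed in /proc/partitions.
        for ((i = 1; i <= 20; i++)); do
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1   # assumed retry delay; not visible in the xtrace
        done
        # Prove the device is actually readable: one direct 4 KiB read, then
        # confirm the copied file is non-empty (the "'[' 4096 '!=' 0 ']'" check above).
        for ((i = 1; i <= 20; i++)); do
            dd if=/dev/"$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct || continue
            size=$(stat -c %s /tmp/nbdtest)
            rm -f /tmp/nbdtest
            [ "$size" != 0 ] && return 0
        done
        return 1   # assumed failure path; the log only shows the success branch
    }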
00:06:22.867 04:56:42 blockdev_nvme -- bdev/blockdev.sh@812 -- # trap cleanup SIGINT SIGTERM EXIT 00:06:22.867 04:56:42 blockdev_nvme -- bdev/blockdev.sh@814 -- # run_test bdev_verify /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:06:22.867 04:56:42 blockdev_nvme -- common/autotest_common.sh@1105 -- # '[' 16 -le 1 ']' 00:06:22.867 04:56:42 blockdev_nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:22.867 04:56:42 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:22.867 ************************************ 00:06:22.867 START TEST bdev_verify 00:06:22.867 ************************************ 00:06:22.867 04:56:42 blockdev_nvme.bdev_verify -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:06:22.867 [2024-12-15 04:56:43.004000] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:22.867 [2024-12-15 04:56:43.004123] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid74101 ] 00:06:23.127 [2024-12-15 04:56:43.165984] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:23.127 [2024-12-15 04:56:43.186542] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:23.127 [2024-12-15 04:56:43.186586] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:06:23.700 Running I/O for 5 seconds... 00:06:26.038 16512.00 IOPS, 64.50 MiB/s [2024-12-15T04:56:47.120Z] 17472.00 IOPS, 68.25 MiB/s [2024-12-15T04:56:48.063Z] 17813.33 IOPS, 69.58 MiB/s [2024-12-15T04:56:49.006Z] 18016.00 IOPS, 70.38 MiB/s [2024-12-15T04:56:49.006Z] 18316.80 IOPS, 71.55 MiB/s 00:06:28.866 Latency(us) 00:06:28.866 [2024-12-15T04:56:49.006Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:06:28.866 Job: Nvme0n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:06:28.866 Verification LBA range: start 0x0 length 0xbd0bd 00:06:28.866 Nvme0n1 : 5.04 1523.58 5.95 0.00 0.00 83715.53 17845.96 87515.77 00:06:28.866 Job: Nvme0n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:06:28.866 Verification LBA range: start 0xbd0bd length 0xbd0bd 00:06:28.866 Nvme0n1 : 5.07 1476.07 5.77 0.00 0.00 86310.73 13208.02 89128.96 00:06:28.866 Job: Nvme1n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:06:28.866 Verification LBA range: start 0x0 length 0xa0000 00:06:28.866 Nvme1n1 : 5.08 1524.44 5.95 0.00 0.00 83442.79 12149.37 87919.06 00:06:28.866 Job: Nvme1n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:06:28.866 Verification LBA range: start 0xa0000 length 0xa0000 00:06:28.866 Nvme1n1 : 5.08 1475.38 5.76 0.00 0.00 86239.26 14417.92 83079.48 00:06:28.866 Job: Nvme2n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:06:28.866 Verification LBA range: start 0x0 length 0x80000 00:06:28.866 Nvme2n1 : 5.10 1531.68 5.98 0.00 0.00 82900.47 13611.32 70577.23 00:06:28.866 Job: Nvme2n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:06:28.866 Verification LBA range: start 0x80000 length 0x80000 00:06:28.866 Nvme2n1 : 5.09 1482.64 5.79 0.00 0.00 85608.91 14317.10 68157.44 00:06:28.866 Job: Nvme2n2 
(Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:06:28.866 Verification LBA range: start 0x0 length 0x80000 00:06:28.866 Nvme2n2 : 5.10 1530.12 5.98 0.00 0.00 82814.08 17845.96 62107.96 00:06:28.867 Job: Nvme2n2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:06:28.867 Verification LBA range: start 0x80000 length 0x80000 00:06:28.867 Nvme2n2 : 5.10 1482.07 5.79 0.00 0.00 85439.06 14922.04 65334.35 00:06:28.867 Job: Nvme2n3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:06:28.867 Verification LBA range: start 0x0 length 0x80000 00:06:28.867 Nvme2n3 : 5.11 1528.87 5.97 0.00 0.00 82734.48 19862.45 63721.16 00:06:28.867 Job: Nvme2n3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:06:28.867 Verification LBA range: start 0x80000 length 0x80000 00:06:28.867 Nvme2n3 : 5.10 1481.54 5.79 0.00 0.00 85349.57 15728.64 65334.35 00:06:28.867 Job: Nvme3n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:06:28.867 Verification LBA range: start 0x0 length 0x20000 00:06:28.867 Nvme3n1 : 5.11 1528.44 5.97 0.00 0.00 82633.18 18753.38 66140.95 00:06:28.867 Job: Nvme3n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:06:28.867 Verification LBA range: start 0x20000 length 0x20000 00:06:28.867 Nvme3n1 : 5.10 1480.07 5.78 0.00 0.00 85295.28 18450.90 66947.54 00:06:28.867 [2024-12-15T04:56:49.007Z] =================================================================================================================== 00:06:28.867 [2024-12-15T04:56:49.007Z] Total : 18044.90 70.49 0.00 0.00 84350.23 12149.37 89128.96 00:06:29.439 00:06:29.439 real 0m6.405s 00:06:29.439 user 0m12.003s 00:06:29.439 sys 0m0.250s 00:06:29.439 04:56:49 blockdev_nvme.bdev_verify -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:29.439 04:56:49 blockdev_nvme.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:06:29.439 ************************************ 00:06:29.439 END TEST bdev_verify 00:06:29.439 ************************************ 00:06:29.439 04:56:49 blockdev_nvme -- bdev/blockdev.sh@815 -- # run_test bdev_verify_big_io /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:06:29.439 04:56:49 blockdev_nvme -- common/autotest_common.sh@1105 -- # '[' 16 -le 1 ']' 00:06:29.439 04:56:49 blockdev_nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:29.439 04:56:49 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:29.439 ************************************ 00:06:29.439 START TEST bdev_verify_big_io 00:06:29.439 ************************************ 00:06:29.439 04:56:49 blockdev_nvme.bdev_verify_big_io -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:06:29.439 [2024-12-15 04:56:49.488920] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
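[Editor's note] From here the harness has moved off the nbd plumbing and onto the bdevperf-driven tests: bdev_verify just finished and bdev_verify_big_io is starting. Both reuse the same pattern — bdevperf loads the bdevs from a JSON config and runs one workload across all of them — differing only in IO size (-o 4096 vs -o 65536) and runtime. A sketch of the verify invocation, with flags copied from the command lines in this log (-q 128 queue depth, -o IO size in bytes, -w workload, -t run time in seconds, -m 0x3 matching the two reactors on cores 0 and 1 seen above; -C passed through as the harness does):

    SPDK=/home/vagrant/spdk_repo/spdk
    "$SPDK"/build/examples/bdevperf \
        --json "$SPDK"/test/bdev/bdev.json \
        -q 128 -o 4096 -w verify -t 5 -C -m 0x3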
00:06:29.439 [2024-12-15 04:56:49.489088] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid74194 ] 00:06:29.700 [2024-12-15 04:56:49.655962] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:29.700 [2024-12-15 04:56:49.687580] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:29.700 [2024-12-15 04:56:49.687589] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:06:30.271 Running I/O for 5 seconds... 00:06:35.841 2256.00 IOPS, 141.00 MiB/s [2024-12-15T04:56:56.241Z] 2921.50 IOPS, 182.59 MiB/s [2024-12-15T04:56:56.499Z] 3128.00 IOPS, 195.50 MiB/s 00:06:36.359 Latency(us) 00:06:36.359 [2024-12-15T04:56:56.499Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:06:36.359 Job: Nvme0n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0x0 length 0xbd0b 00:06:36.359 Nvme0n1 : 5.80 88.21 5.51 0.00 0.00 1405785.60 22584.71 1432516.14 00:06:36.359 Job: Nvme0n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0xbd0b length 0xbd0b 00:06:36.359 Nvme0n1 : 5.51 162.58 10.16 0.00 0.00 762147.33 41539.74 871124.68 00:06:36.359 Job: Nvme1n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0x0 length 0xa000 00:06:36.359 Nvme1n1 : 5.81 88.17 5.51 0.00 0.00 1337496.42 41338.09 1200216.22 00:06:36.359 Job: Nvme1n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0xa000 length 0xa000 00:06:36.359 Nvme1n1 : 5.51 162.51 10.16 0.00 0.00 741195.51 108890.58 732390.01 00:06:36.359 Job: Nvme2n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0x0 length 0x8000 00:06:36.359 Nvme2n1 : 5.85 91.36 5.71 0.00 0.00 1227769.10 36095.21 1329271.73 00:06:36.359 Job: Nvme2n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0x8000 length 0x8000 00:06:36.359 Nvme2n1 : 5.67 169.38 10.59 0.00 0.00 697629.14 38515.00 745295.56 00:06:36.359 Job: Nvme2n2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0x0 length 0x8000 00:06:36.359 Nvme2n2 : 5.92 108.03 6.75 0.00 0.00 1004283.59 26819.35 1329271.73 00:06:36.359 Job: Nvme2n2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0x8000 length 0x8000 00:06:36.359 Nvme2n2 : 5.74 174.59 10.91 0.00 0.00 660567.06 27021.00 764653.88 00:06:36.359 Job: Nvme2n3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0x0 length 0x8000 00:06:36.359 Nvme2n3 : 6.02 131.99 8.25 0.00 0.00 794678.54 11594.83 1342177.28 00:06:36.359 Job: Nvme2n3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0x8000 length 0x8000 00:06:36.359 Nvme2n3 : 5.74 178.37 11.15 0.00 0.00 632066.56 42346.34 784012.21 00:06:36.359 Job: Nvme3n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0x0 length 0x2000 00:06:36.359 Nvme3n1 : 6.21 231.46 14.47 0.00 0.00 436291.87 1077.56 1522854.99 00:06:36.359 Job: Nvme3n1 (Core Mask 0x2, workload: verify, 
depth: 128, IO size: 65536) 00:06:36.359 Verification LBA range: start 0x2000 length 0x2000 00:06:36.359 Nvme3n1 : 5.77 195.51 12.22 0.00 0.00 563985.02 844.41 784012.21 00:06:36.359 [2024-12-15T04:56:56.499Z] =================================================================================================================== 00:06:36.359 [2024-12-15T04:56:56.499Z] Total : 1782.15 111.38 0.00 0.00 766586.63 844.41 1522854.99 00:06:37.294 00:06:37.294 real 0m7.930s 00:06:37.294 user 0m14.983s 00:06:37.294 sys 0m0.304s 00:06:37.294 04:56:57 blockdev_nvme.bdev_verify_big_io -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:37.294 04:56:57 blockdev_nvme.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:06:37.294 ************************************ 00:06:37.294 END TEST bdev_verify_big_io 00:06:37.294 ************************************ 00:06:37.294 04:56:57 blockdev_nvme -- bdev/blockdev.sh@816 -- # run_test bdev_write_zeroes /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:06:37.294 04:56:57 blockdev_nvme -- common/autotest_common.sh@1105 -- # '[' 13 -le 1 ']' 00:06:37.294 04:56:57 blockdev_nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:37.294 04:56:57 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:37.294 ************************************ 00:06:37.294 START TEST bdev_write_zeroes 00:06:37.294 ************************************ 00:06:37.294 04:56:57 blockdev_nvme.bdev_write_zeroes -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:06:37.552 [2024-12-15 04:56:57.447840] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:37.552 [2024-12-15 04:56:57.447984] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid74299 ] 00:06:37.552 [2024-12-15 04:56:57.604801] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.552 [2024-12-15 04:56:57.624635] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.117 Running I/O for 1 seconds... 
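Before the write_zeroes results arrive, the two verify summaries above can be cross-checked arithmetically: MiB/s equals IOPS times the I/O size divided by 1 MiB, i.e. IOPS/256 at 4096-byte I/Os and IOPS/16 at 65536-byte I/Os. A one-liner confirming both Total rows:

  awk 'BEGIN { printf "%.2f %.2f\n", 18044.90 / 256, 1782.15 / 16 }'
  # prints "70.49 111.38", matching the 4 KiB and 64 KiB Total rows above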
00:06:39.048 67584.00 IOPS, 264.00 MiB/s 00:06:39.048 Latency(us) 00:06:39.048 [2024-12-15T04:56:59.188Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:06:39.048 Job: Nvme0n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:06:39.048 Nvme0n1 : 1.02 11211.48 43.79 0.00 0.00 11382.07 8822.15 26617.70 00:06:39.048 Job: Nvme1n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:06:39.048 Nvme1n1 : 1.02 11198.84 43.75 0.00 0.00 11381.78 8721.33 26416.05 00:06:39.048 Job: Nvme2n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:06:39.048 Nvme2n1 : 1.03 11225.01 43.85 0.00 0.00 11293.45 7007.31 20971.52 00:06:39.048 Job: Nvme2n2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:06:39.048 Nvme2n2 : 1.03 11212.44 43.80 0.00 0.00 11275.48 7561.85 19559.98 00:06:39.048 Job: Nvme2n3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:06:39.048 Nvme2n3 : 1.03 11199.81 43.75 0.00 0.00 11259.37 7965.14 19257.50 00:06:39.048 Job: Nvme3n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:06:39.048 Nvme3n1 : 1.03 11175.69 43.66 0.00 0.00 11255.64 8015.56 19862.45 00:06:39.048 [2024-12-15T04:56:59.188Z] =================================================================================================================== 00:06:39.048 [2024-12-15T04:56:59.188Z] Total : 67223.27 262.59 0.00 0.00 11307.88 7007.31 26617.70 00:06:39.306 00:06:39.306 real 0m1.835s 00:06:39.306 user 0m1.532s 00:06:39.306 sys 0m0.192s 00:06:39.306 04:56:59 blockdev_nvme.bdev_write_zeroes -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:39.306 ************************************ 00:06:39.306 END TEST bdev_write_zeroes 00:06:39.306 ************************************ 00:06:39.306 04:56:59 blockdev_nvme.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:06:39.306 04:56:59 blockdev_nvme -- bdev/blockdev.sh@819 -- # run_test bdev_json_nonenclosed /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:06:39.306 04:56:59 blockdev_nvme -- common/autotest_common.sh@1105 -- # '[' 13 -le 1 ']' 00:06:39.306 04:56:59 blockdev_nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:39.307 04:56:59 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:39.307 ************************************ 00:06:39.307 START TEST bdev_json_nonenclosed 00:06:39.307 ************************************ 00:06:39.307 04:56:59 blockdev_nvme.bdev_json_nonenclosed -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:06:39.307 [2024-12-15 04:56:59.342918] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:06:39.307 [2024-12-15 04:56:59.343048] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid74340 ] 00:06:39.565 [2024-12-15 04:56:59.503994] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.565 [2024-12-15 04:56:59.523516] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.565 [2024-12-15 04:56:59.523593] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:06:39.565 [2024-12-15 04:56:59.523608] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:06:39.565 [2024-12-15 04:56:59.523621] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:39.565 00:06:39.565 real 0m0.313s 00:06:39.565 user 0m0.111s 00:06:39.565 sys 0m0.099s 00:06:39.565 04:56:59 blockdev_nvme.bdev_json_nonenclosed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:39.565 ************************************ 00:06:39.565 END TEST bdev_json_nonenclosed 00:06:39.565 04:56:59 blockdev_nvme.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:06:39.565 ************************************ 00:06:39.565 04:56:59 blockdev_nvme -- bdev/blockdev.sh@822 -- # run_test bdev_json_nonarray /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:06:39.565 04:56:59 blockdev_nvme -- common/autotest_common.sh@1105 -- # '[' 13 -le 1 ']' 00:06:39.565 04:56:59 blockdev_nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:39.565 04:56:59 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:39.565 ************************************ 00:06:39.565 START TEST bdev_json_nonarray 00:06:39.565 ************************************ 00:06:39.565 04:56:59 blockdev_nvme.bdev_json_nonarray -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:06:39.823 [2024-12-15 04:56:59.706092] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:39.823 [2024-12-15 04:56:59.706212] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid74361 ] 00:06:39.823 [2024-12-15 04:56:59.862502] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.823 [2024-12-15 04:56:59.882116] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.823 [2024-12-15 04:56:59.882204] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
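The two negative tests above feed bdevperf deliberately malformed configuration files. The fixtures themselves are not printed in the log, so the shapes below are illustrative guesses inferred from the two error messages; the loader evidently expects a top-level JSON object whose "subsystems" key is an array:

  # Hypothetical minimal valid shape:
  cat > /tmp/good.json <<'EOF'
  { "subsystems": [ { "subsystem": "bdev", "config": [] } ] }
  EOF
  # "not enclosed in {}":  top level is not an object, e.g.  [ ... ]
  # "'subsystems' should be an array":  e.g.  { "subsystems": { } }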
00:06:39.823 [2024-12-15 04:56:59.882219] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:06:39.823 [2024-12-15 04:56:59.882230] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:39.823 00:06:39.823 real 0m0.304s 00:06:39.823 user 0m0.112s 00:06:39.823 sys 0m0.090s 00:06:39.823 04:56:59 blockdev_nvme.bdev_json_nonarray -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:39.823 04:56:59 blockdev_nvme.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:06:39.823 ************************************ 00:06:39.823 END TEST bdev_json_nonarray 00:06:39.823 ************************************ 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@824 -- # [[ nvme == bdev ]] 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@832 -- # [[ nvme == gpt ]] 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@836 -- # [[ nvme == crypto_sw ]] 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@848 -- # trap - SIGINT SIGTERM EXIT 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@849 -- # cleanup 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@23 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/aiofile 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@24 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@26 -- # [[ nvme == rbd ]] 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@30 -- # [[ nvme == daos ]] 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@34 -- # [[ nvme = \g\p\t ]] 00:06:40.082 04:56:59 blockdev_nvme -- bdev/blockdev.sh@40 -- # [[ nvme == xnvme ]] 00:06:40.082 00:06:40.082 real 0m31.207s 00:06:40.082 user 0m48.695s 00:06:40.082 sys 0m5.593s 00:06:40.082 04:56:59 blockdev_nvme -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:40.082 04:56:59 blockdev_nvme -- common/autotest_common.sh@10 -- # set +x 00:06:40.082 ************************************ 00:06:40.082 END TEST blockdev_nvme 00:06:40.082 ************************************ 00:06:40.082 04:57:00 -- spdk/autotest.sh@209 -- # uname -s 00:06:40.082 04:57:00 -- spdk/autotest.sh@209 -- # [[ Linux == Linux ]] 00:06:40.082 04:57:00 -- spdk/autotest.sh@210 -- # run_test blockdev_nvme_gpt /home/vagrant/spdk_repo/spdk/test/bdev/blockdev.sh gpt 00:06:40.082 04:57:00 -- common/autotest_common.sh@1105 -- # '[' 3 -le 1 ']' 00:06:40.082 04:57:00 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:40.082 04:57:00 -- common/autotest_common.sh@10 -- # set +x 00:06:40.082 ************************************ 00:06:40.082 START TEST blockdev_nvme_gpt 00:06:40.082 ************************************ 00:06:40.082 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/bdev/blockdev.sh gpt 00:06:40.082 * Looking for test storage... 
00:06:40.082 * Found test storage at /home/vagrant/spdk_repo/spdk/test/bdev 00:06:40.082 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:06:40.082 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@1711 -- # lcov --version 00:06:40.082 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:06:40.082 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@333 -- # local ver1 ver1_l 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@334 -- # local ver2 ver2_l 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@336 -- # IFS=.-: 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@336 -- # read -ra ver1 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@337 -- # IFS=.-: 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@337 -- # read -ra ver2 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@338 -- # local 'op=<' 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@340 -- # ver1_l=2 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@341 -- # ver2_l=1 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@344 -- # case "$op" in 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@345 -- # : 1 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@364 -- # (( v = 0 )) 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@365 -- # decimal 1 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@353 -- # local d=1 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@355 -- # echo 1 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@365 -- # ver1[v]=1 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@366 -- # decimal 2 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@353 -- # local d=2 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@355 -- # echo 2 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@366 -- # ver2[v]=2 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:06:40.082 04:57:00 blockdev_nvme_gpt -- scripts/common.sh@368 -- # return 0 00:06:40.082 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:06:40.082 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:06:40.082 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.082 --rc genhtml_branch_coverage=1 00:06:40.082 --rc genhtml_function_coverage=1 00:06:40.082 --rc genhtml_legend=1 00:06:40.082 --rc geninfo_all_blocks=1 00:06:40.082 --rc geninfo_unexecuted_blocks=1 00:06:40.082 00:06:40.082 ' 00:06:40.082 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:06:40.082 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.082 --rc 
genhtml_branch_coverage=1 00:06:40.082 --rc genhtml_function_coverage=1 00:06:40.082 --rc genhtml_legend=1 00:06:40.083 --rc geninfo_all_blocks=1 00:06:40.083 --rc geninfo_unexecuted_blocks=1 00:06:40.083 00:06:40.083 ' 00:06:40.083 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:06:40.083 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.083 --rc genhtml_branch_coverage=1 00:06:40.083 --rc genhtml_function_coverage=1 00:06:40.083 --rc genhtml_legend=1 00:06:40.083 --rc geninfo_all_blocks=1 00:06:40.083 --rc geninfo_unexecuted_blocks=1 00:06:40.083 00:06:40.083 ' 00:06:40.083 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:06:40.083 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:06:40.083 --rc genhtml_branch_coverage=1 00:06:40.083 --rc genhtml_function_coverage=1 00:06:40.083 --rc genhtml_legend=1 00:06:40.083 --rc geninfo_all_blocks=1 00:06:40.083 --rc geninfo_unexecuted_blocks=1 00:06:40.083 00:06:40.083 ' 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@10 -- # source /home/vagrant/spdk_repo/spdk/test/bdev/nbd_common.sh 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/nbd_common.sh@6 -- # set -e 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@13 -- # conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@20 -- # : 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@707 -- # QOS_DEV_1=Malloc_0 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@708 -- # QOS_DEV_2=Null_1 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@709 -- # QOS_RUN_TIME=5 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@711 -- # uname -s 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@711 -- # '[' Linux = Linux ']' 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@713 -- # PRE_RESERVED_MEM=0 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@719 -- # test_type=gpt 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@720 -- # crypto_device= 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@721 -- # dek= 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@722 -- # env_ctx= 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@723 -- # wait_for_rpc= 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@724 -- # '[' -n '' ']' 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@727 -- # [[ gpt == bdev ]] 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@727 -- # [[ gpt == crypto_* ]] 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@730 -- # start_spdk_tgt 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=74434 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@49 -- # waitforlisten 74434 
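The lt/cmp_versions trace a few lines up (scripts/common.sh@333-368) decides whether the installed lcov is older than 2 by splitting both version strings on '.', '-' and ':' and comparing them field by field. A condensed, numbers-only sketch of that comparison (the real helper also validates each field through its decimal function):

  cmp_lt() {   # return 0 if dotted version $1 < $2
      local IFS=.-:
      local -a a b
      read -ra a <<< "$1"; read -ra b <<< "$2"
      local n=$(( ${#a[@]} > ${#b[@]} ? ${#a[@]} : ${#b[@]} )) v
      for (( v = 0; v < n; v++ )); do
          (( ${a[v]:-0} < ${b[v]:-0} )) && return 0
          (( ${a[v]:-0} > ${b[v]:-0} )) && return 1
      done
      return 1   # equal
  }
  cmp_lt 1.15 2 && echo "1.15 < 2"   # matches the trace: lt 1.15 2 succeeds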
00:06:40.083 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@835 -- # '[' -z 74434 ']' 00:06:40.083 04:57:00 blockdev_nvme_gpt -- bdev/blockdev.sh@46 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt '' '' 00:06:40.083 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:40.083 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:40.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:40.083 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:40.083 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:40.083 04:57:00 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:40.341 [2024-12-15 04:57:00.274236] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:40.341 [2024-12-15 04:57:00.274352] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid74434 ] 00:06:40.341 [2024-12-15 04:57:00.430215] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.341 [2024-12-15 04:57:00.453930] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:41.272 04:57:01 blockdev_nvme_gpt -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:41.272 04:57:01 blockdev_nvme_gpt -- common/autotest_common.sh@868 -- # return 0 00:06:41.272 04:57:01 blockdev_nvme_gpt -- bdev/blockdev.sh@731 -- # case "$test_type" in 00:06:41.272 04:57:01 blockdev_nvme_gpt -- bdev/blockdev.sh@739 -- # setup_gpt_conf 00:06:41.272 04:57:01 blockdev_nvme_gpt -- bdev/blockdev.sh@104 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:06:41.272 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:06:41.530 Waiting for block devices as requested 00:06:41.530 0000:00:11.0 (1b36 0010): uio_pci_generic -> nvme 00:06:41.530 0000:00:10.0 (1b36 0010): uio_pci_generic -> nvme 00:06:41.788 0000:00:12.0 (1b36 0010): uio_pci_generic -> nvme 00:06:41.788 0000:00:13.0 (1b36 0010): uio_pci_generic -> nvme 00:06:47.073 * Events for some block/disk devices (0000:00:13.0) were not caught, they may be missing 00:06:47.073 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@105 -- # get_zoned_devs 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1658 -- # zoned_ctrls=() 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1658 -- # local -A zoned_ctrls 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1659 -- # local nvme bdf ns 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1669 -- # bdf=0000:00:11.0 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n1 00:06:47.073 04:57:06 blockdev_nvme_gpt -- 
common/autotest_common.sh@1650 -- # local device=nvme0n1 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1669 -- # bdf=0000:00:10.0 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1n1 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1650 -- # local device=nvme1n1 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1669 -- # bdf=0000:00:12.0 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1671 -- # is_block_zoned nvme2n1 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1650 -- # local device=nvme2n1 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme2n1/queue/zoned ]] 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1671 -- # is_block_zoned nvme2n2 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1650 -- # local device=nvme2n2 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme2n2/queue/zoned ]] 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1671 -- # is_block_zoned nvme2n3 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1650 -- # local device=nvme2n3 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme2n3/queue/zoned ]] 00:06:47.073 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:06:47.074 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:06:47.074 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1669 -- # bdf=0000:00:13.0 00:06:47.074 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:06:47.074 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1671 -- # is_block_zoned nvme3c3n1 00:06:47.074 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1650 -- # local device=nvme3c3n1 00:06:47.074 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme3c3n1/queue/zoned ]] 00:06:47.074 04:57:06 blockdev_nvme_gpt -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:06:47.074 04:57:06 
blockdev_nvme_gpt -- bdev/blockdev.sh@106 -- # nvme_devs=('/sys/block/nvme0n1' '/sys/block/nvme1n1' '/sys/block/nvme2n1' '/sys/block/nvme2n2' '/sys/block/nvme2n3' '/sys/block/nvme3n1') 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@106 -- # local nvme_devs nvme_dev 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@107 -- # gpt_nvme= 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@109 -- # for nvme_dev in "${nvme_devs[@]}" 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@110 -- # [[ -z '' ]] 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@111 -- # dev=/dev/nvme0n1 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@112 -- # parted /dev/nvme0n1 -ms print 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@112 -- # pt='Error: /dev/nvme0n1: unrecognised disk label 00:06:47.074 BYT; 00:06:47.074 /dev/nvme0n1:5369MB:nvme:4096:4096:unknown:QEMU NVMe Ctrl:;' 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@113 -- # [[ Error: /dev/nvme0n1: unrecognised disk label 00:06:47.074 BYT; 00:06:47.074 /dev/nvme0n1:5369MB:nvme:4096:4096:unknown:QEMU NVMe Ctrl:; == *\/\d\e\v\/\n\v\m\e\0\n\1\:\ \u\n\r\e\c\o\g\n\i\s\e\d\ \d\i\s\k\ \l\a\b\e\l* ]] 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@114 -- # gpt_nvme=/dev/nvme0n1 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@115 -- # break 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@118 -- # [[ -n /dev/nvme0n1 ]] 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@123 -- # typeset -g g_unique_partguid=6f89f330-603b-4116-ac73-2ca8eae53030 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@124 -- # typeset -g g_unique_partguid_old=abf1734f-66e5-4c0f-aa29-4021d4d307df 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@127 -- # parted -s /dev/nvme0n1 mklabel gpt mkpart SPDK_TEST_first 0% 50% mkpart SPDK_TEST_second 50% 100% 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@129 -- # get_spdk_gpt_old 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@411 -- # local spdk_guid 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@413 -- # [[ -e /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.h ]] 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@415 -- # GPT_H=/home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.h 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@416 -- # IFS='()' 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@416 -- # read -r _ spdk_guid _ 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@416 -- # grep -w SPDK_GPT_PART_TYPE_GUID_OLD /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.h 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@417 -- # spdk_guid=0x7c5222bd-0x8f5d-0x4087-0x9c00-0xbf9843c7b58c 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@417 -- # spdk_guid=7c5222bd-8f5d-4087-9c00-bf9843c7b58c 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@419 -- # echo 7c5222bd-8f5d-4087-9c00-bf9843c7b58c 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@129 -- # SPDK_GPT_OLD_GUID=7c5222bd-8f5d-4087-9c00-bf9843c7b58c 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@130 -- # get_spdk_gpt 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@423 -- # local spdk_guid 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@425 -- # [[ -e /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.h ]] 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@427 -- # 
GPT_H=/home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.h 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@428 -- # IFS='()' 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@428 -- # read -r _ spdk_guid _ 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@428 -- # grep -w SPDK_GPT_PART_TYPE_GUID /home/vagrant/spdk_repo/spdk/module/bdev/gpt/gpt.h 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@429 -- # spdk_guid=0x6527994e-0x2c5a-0x4eec-0x9613-0x8f5944074e8b 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@429 -- # spdk_guid=6527994e-2c5a-4eec-9613-8f5944074e8b 00:06:47.074 04:57:06 blockdev_nvme_gpt -- scripts/common.sh@431 -- # echo 6527994e-2c5a-4eec-9613-8f5944074e8b 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@130 -- # SPDK_GPT_GUID=6527994e-2c5a-4eec-9613-8f5944074e8b 00:06:47.074 04:57:06 blockdev_nvme_gpt -- bdev/blockdev.sh@131 -- # sgdisk -t 1:6527994e-2c5a-4eec-9613-8f5944074e8b -u 1:6f89f330-603b-4116-ac73-2ca8eae53030 /dev/nvme0n1 00:06:48.446 The operation has completed successfully. 00:06:48.446 04:57:08 blockdev_nvme_gpt -- bdev/blockdev.sh@132 -- # sgdisk -t 2:7c5222bd-8f5d-4087-9c00-bf9843c7b58c -u 2:abf1734f-66e5-4c0f-aa29-4021d4d307df /dev/nvme0n1 00:06:49.380 The operation has completed successfully. 00:06:49.380 04:57:09 blockdev_nvme_gpt -- bdev/blockdev.sh@133 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:06:49.945 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:06:50.202 0000:00:11.0 (1b36 0010): nvme -> uio_pci_generic 00:06:50.460 0000:00:13.0 (1b36 0010): nvme -> uio_pci_generic 00:06:50.460 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:06:50.460 0000:00:12.0 (1b36 0010): nvme -> uio_pci_generic 00:06:50.460 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@134 -- # rpc_cmd bdev_get_bdevs 00:06:50.460 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:50.460 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:50.460 [] 00:06:50.460 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:50.460 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@135 -- # setup_nvme_conf 00:06:50.460 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@81 -- # local json 00:06:50.460 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@82 -- # mapfile -t json 00:06:50.460 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@82 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:06:50.460 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@83 -- # rpc_cmd load_subsystem_config -j ''\''{ "subsystem": "bdev", "config": [ { "method": "bdev_nvme_attach_controller", "params": { "trtype": "PCIe", "name":"Nvme0", "traddr":"0000:00:10.0" } },{ "method": "bdev_nvme_attach_controller", "params": { "trtype": "PCIe", "name":"Nvme1", "traddr":"0000:00:11.0" } },{ "method": "bdev_nvme_attach_controller", "params": { "trtype": "PCIe", "name":"Nvme2", "traddr":"0000:00:12.0" } },{ "method": "bdev_nvme_attach_controller", "params": { "trtype": "PCIe", "name":"Nvme3", "traddr":"0000:00:13.0" } } ] }'\''' 00:06:50.460 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:50.460 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:50.718 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@774 -- # rpc_cmd bdev_wait_for_examine 00:06:50.718 04:57:10 
blockdev_nvme_gpt -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:50.718 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@777 -- # cat 00:06:50.718 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@777 -- # rpc_cmd save_subsystem_config -n accel 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:50.718 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@777 -- # rpc_cmd save_subsystem_config -n bdev 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:50.718 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@777 -- # rpc_cmd save_subsystem_config -n iobuf 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:50.718 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@785 -- # mapfile -t bdevs 00:06:50.718 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@785 -- # rpc_cmd bdev_get_bdevs 00:06:50.718 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@785 -- # jq -r '.[] | select(.claimed == false)' 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@563 -- # xtrace_disable 00:06:50.718 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:50.977 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:06:50.977 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@786 -- # mapfile -t bdevs_name 00:06:50.977 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@786 -- # jq -r .name 00:06:50.978 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@786 -- # printf '%s\n' '{' ' "name": "Nvme0n1",' ' "aliases": [' ' "65aaf4fb-5c13-44c9-8f40-72ed12e0ec8f"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 1548666,' ' "uuid": "65aaf4fb-5c13-44c9-8f40-72ed12e0ec8f",' ' "numa_id": -1,' ' "md_size": 64,' ' "md_interleave": false,' ' "dif_type": 0,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": true,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:10.0",' ' "trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:10.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12340",' ' "firmware_revision": "8.0.0",' ' "subnqn": "nqn.2019-08.org.qemu:12340",' ' 
"oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": false,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 1,' ' "can_share": false' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' '{' ' "name": "Nvme1n1p1",' ' "aliases": [' ' "6f89f330-603b-4116-ac73-2ca8eae53030"' ' ],' ' "product_name": "GPT Disk",' ' "block_size": 4096,' ' "num_blocks": 655104,' ' "uuid": "6f89f330-603b-4116-ac73-2ca8eae53030",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "gpt": {' ' "base_bdev": "Nvme1n1",' ' "offset_blocks": 256,' ' "partition_type_guid": "6527994e-2c5a-4eec-9613-8f5944074e8b",' ' "unique_partition_guid": "6f89f330-603b-4116-ac73-2ca8eae53030",' ' "partition_name": "SPDK_TEST_first"' ' }' ' }' '}' '{' ' "name": "Nvme1n1p2",' ' "aliases": [' ' "abf1734f-66e5-4c0f-aa29-4021d4d307df"' ' ],' ' "product_name": "GPT Disk",' ' "block_size": 4096,' ' "num_blocks": 655103,' ' "uuid": "abf1734f-66e5-4c0f-aa29-4021d4d307df",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "gpt": {' ' "base_bdev": "Nvme1n1",' ' "offset_blocks": 655360,' ' "partition_type_guid": "7c5222bd-8f5d-4087-9c00-bf9843c7b58c",' ' "unique_partition_guid": "abf1734f-66e5-4c0f-aa29-4021d4d307df",' ' "partition_name": "SPDK_TEST_second"' ' }' ' }' '}' '{' ' "name": "Nvme2n1",' ' "aliases": [' ' "de60f1fb-8f73-4d8a-8450-719ef45f43cc"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "de60f1fb-8f73-4d8a-8450-719ef45f43cc",' ' "numa_id": -1,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:12.0",' ' 
"trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:12.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12342",' ' "firmware_revision": "8.0.0",' ' "subnqn": "nqn.2019-08.org.qemu:12342",' ' "oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": false,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 1,' ' "can_share": false' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' '{' ' "name": "Nvme2n2",' ' "aliases": [' ' "2fb86704-cf4b-43f2-a160-32b8ddeca19a"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "2fb86704-cf4b-43f2-a160-32b8ddeca19a",' ' "numa_id": -1,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:12.0",' ' "trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:12.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12342",' ' "firmware_revision": "8.0.0",' ' "subnqn": "nqn.2019-08.org.qemu:12342",' ' "oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": false,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 2,' ' "can_share": false' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' '{' ' "name": "Nvme2n3",' ' "aliases": [' ' "c882a404-ee6a-417b-8714-a9e6e16975f9"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "c882a404-ee6a-417b-8714-a9e6e16975f9",' ' "numa_id": -1,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:12.0",' ' "trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:12.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12342",' ' "firmware_revision": "8.0.0",' ' "subnqn": "nqn.2019-08.org.qemu:12342",' ' "oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": false,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 3,' ' 
"can_share": false' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' '{' ' "name": "Nvme3n1",' ' "aliases": [' ' "2e4c0f45-7d56-4fef-83b9-87c9fa8ba77e"' ' ],' ' "product_name": "NVMe disk",' ' "block_size": 4096,' ' "num_blocks": 262144,' ' "uuid": "2e4c0f45-7d56-4fef-83b9-87c9fa8ba77e",' ' "numa_id": -1,' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": true,' ' "nvme_io": true,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": true,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "nvme": [' ' {' ' "pci_address": "0000:00:13.0",' ' "trid": {' ' "trtype": "PCIe",' ' "traddr": "0000:00:13.0"' ' },' ' "ctrlr_data": {' ' "cntlid": 0,' ' "vendor_id": "0x1b36",' ' "model_number": "QEMU NVMe Ctrl",' ' "serial_number": "12343",' ' "firmware_revision": "8.0.0",' ' "subnqn": "nqn.2019-08.org.qemu:fdp-subsys3",' ' "oacs": {' ' "security": 0,' ' "format": 1,' ' "firmware": 0,' ' "ns_manage": 1' ' },' ' "multi_ctrlr": true,' ' "ana_reporting": false' ' },' ' "vs": {' ' "nvme_version": "1.4"' ' },' ' "ns_data": {' ' "id": 1,' ' "can_share": true' ' }' ' }' ' ],' ' "mp_policy": "active_passive"' ' }' '}' 00:06:50.978 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@787 -- # bdev_list=("${bdevs_name[@]}") 00:06:50.978 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@789 -- # hello_world_bdev=Nvme0n1 00:06:50.978 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@790 -- # trap - SIGINT SIGTERM EXIT 00:06:50.978 04:57:10 blockdev_nvme_gpt -- bdev/blockdev.sh@791 -- # killprocess 74434 00:06:50.978 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@954 -- # '[' -z 74434 ']' 00:06:50.978 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@958 -- # kill -0 74434 00:06:50.978 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@959 -- # uname 00:06:50.978 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:50.978 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 74434 00:06:50.978 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:50.978 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:50.978 killing process with pid 74434 00:06:50.978 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@972 -- # echo 'killing process with pid 74434' 00:06:50.978 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@973 -- # kill 74434 00:06:50.978 04:57:10 blockdev_nvme_gpt -- common/autotest_common.sh@978 -- # wait 74434 00:06:51.238 04:57:11 blockdev_nvme_gpt -- bdev/blockdev.sh@795 -- # trap cleanup SIGINT SIGTERM EXIT 00:06:51.238 04:57:11 blockdev_nvme_gpt -- bdev/blockdev.sh@797 -- # run_test bdev_hello_world /home/vagrant/spdk_repo/spdk/build/examples/hello_bdev --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -b Nvme0n1 '' 00:06:51.238 04:57:11 blockdev_nvme_gpt -- common/autotest_common.sh@1105 -- # '[' 7 -le 1 ']' 00:06:51.238 04:57:11 blockdev_nvme_gpt -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:51.238 04:57:11 
blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:51.238 ************************************ 00:06:51.238 START TEST bdev_hello_world 00:06:51.238 ************************************ 00:06:51.238 04:57:11 blockdev_nvme_gpt.bdev_hello_world -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/hello_bdev --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -b Nvme0n1 '' 00:06:51.238 [2024-12-15 04:57:11.289542] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:51.238 [2024-12-15 04:57:11.289668] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75052 ] 00:06:51.496 [2024-12-15 04:57:11.449835] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.496 [2024-12-15 04:57:11.470926] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.760 [2024-12-15 04:57:11.864302] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:06:51.760 [2024-12-15 04:57:11.864357] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Nvme0n1 00:06:51.760 [2024-12-15 04:57:11.864382] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:06:51.760 [2024-12-15 04:57:11.866552] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:06:51.760 [2024-12-15 04:57:11.867608] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:06:51.760 [2024-12-15 04:57:11.867641] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:06:51.760 [2024-12-15 04:57:11.868239] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:06:51.760 00:06:51.760 [2024-12-15 04:57:11.868263] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:06:52.022 00:06:52.022 real 0m0.792s 00:06:52.022 user 0m0.519s 00:06:52.022 sys 0m0.169s 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_hello_world -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:52.022 ************************************ 00:06:52.022 END TEST bdev_hello_world 00:06:52.022 ************************************ 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:06:52.022 04:57:12 blockdev_nvme_gpt -- bdev/blockdev.sh@798 -- # run_test bdev_bounds bdev_bounds '' 00:06:52.022 04:57:12 blockdev_nvme_gpt -- common/autotest_common.sh@1105 -- # '[' 3 -le 1 ']' 00:06:52.022 04:57:12 blockdev_nvme_gpt -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:52.022 04:57:12 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:52.022 ************************************ 00:06:52.022 START TEST bdev_bounds 00:06:52.022 ************************************ 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@1129 -- # bdev_bounds '' 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=75083 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:06:52.022 Process bdevio pid: 75083 00:06:52.022 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
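The bounds suite that follows is driven in two halves, visible in the trace below: the bdevio app is started against the same bdev config and parks itself until an RPC arrives, then a helper script fires the CUnit suites. A condensed sketch (the -w and -s glosses are editorial readings of the trace, with -s 0 matching the PRE_RESERVED_MEM=0 set earlier in blockdev.sh):

  SPDK=/home/vagrant/spdk_repo/spdk
  # -w: wait for an RPC before running tests; -s 0: reserved memory size
  "$SPDK/test/bdev/bdevio/bdevio" -w -s 0 --json "$SPDK/test/bdev/bdev.json" &
  # ...once /var/tmp/spdk.sock is listening...
  "$SPDK/test/bdev/bdevio/tests.py" perform_tests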
00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 75083' 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 75083 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- bdev/blockdev.sh@288 -- # /home/vagrant/spdk_repo/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json '' 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@835 -- # '[' -z 75083 ']' 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:52.022 04:57:12 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:06:52.022 [2024-12-15 04:57:12.136683] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:06:52.022 [2024-12-15 04:57:12.137084] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75083 ] 00:06:52.280 [2024-12-15 04:57:12.297469] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:06:52.280 [2024-12-15 04:57:12.318861] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:06:52.280 [2024-12-15 04:57:12.319134] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:06:52.280 [2024-12-15 04:57:12.319189] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:53.214 04:57:12 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:53.214 04:57:12 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@868 -- # return 0 00:06:53.214 04:57:12 blockdev_nvme_gpt.bdev_bounds -- bdev/blockdev.sh@293 -- # /home/vagrant/spdk_repo/spdk/test/bdev/bdevio/tests.py perform_tests 00:06:53.214 I/O targets: 00:06:53.214 Nvme0n1: 1548666 blocks of 4096 bytes (6050 MiB) 00:06:53.214 Nvme1n1p1: 655104 blocks of 4096 bytes (2559 MiB) 00:06:53.214 Nvme1n1p2: 655103 blocks of 4096 bytes (2559 MiB) 00:06:53.214 Nvme2n1: 1048576 blocks of 4096 bytes (4096 MiB) 00:06:53.214 Nvme2n2: 1048576 blocks of 4096 bytes (4096 MiB) 00:06:53.214 Nvme2n3: 1048576 blocks of 4096 bytes (4096 MiB) 00:06:53.214 Nvme3n1: 262144 blocks of 4096 bytes (1024 MiB) 00:06:53.214 00:06:53.214 00:06:53.214 CUnit - A unit testing framework for C - Version 2.1-3 00:06:53.214 http://cunit.sourceforge.net/ 00:06:53.214 00:06:53.214 00:06:53.214 Suite: bdevio tests on: Nvme3n1 00:06:53.214 Test: blockdev write read block ...passed 00:06:53.215 Test: blockdev write zeroes read block ...passed 00:06:53.215 Test: blockdev write zeroes read no split ...passed 00:06:53.215 Test: blockdev write zeroes read split ...passed 00:06:53.215 Test: blockdev write zeroes read split partial ...passed 00:06:53.215 Test: blockdev reset ...[2024-12-15 04:57:13.106390] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:13.0, 0] resetting controller 00:06:53.215 passed 00:06:53.215 Test: blockdev write read 8 
blocks ...[2024-12-15 04:57:13.108831] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:13.0, 0] Resetting controller successful. 00:06:53.215 passed 00:06:53.215 Test: blockdev write read size > 128k ...passed 00:06:53.215 Test: blockdev write read invalid size ...passed 00:06:53.215 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:53.215 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:53.215 Test: blockdev write read max offset ...passed 00:06:53.215 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:53.215 Test: blockdev writev readv 8 blocks ...passed 00:06:53.215 Test: blockdev writev readv 30 x 1block ...passed 00:06:53.215 Test: blockdev writev readv block ...passed 00:06:53.215 Test: blockdev writev readv size > 128k ...passed 00:06:53.215 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:53.215 Test: blockdev comparev and writev ...[2024-12-15 04:57:13.120659] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:1 lba:0 len:1 SGL DATA BLOCK ADDRESS 0x2c820e000 len:0x1000 00:06:53.215 [2024-12-15 04:57:13.120729] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:53.215 passed 00:06:53.215 Test: blockdev nvme passthru rw ...passed 00:06:53.215 Test: blockdev nvme passthru vendor specific ...[2024-12-15 04:57:13.121316] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:190 PRP1 0x0 PRP2 0x0 00:06:53.215 passed 00:06:53.215 Test: blockdev nvme admin passthru ...[2024-12-15 04:57:13.121346] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:190 cdw0:0 sqhd:001c p:1 m:0 dnr:1 00:06:53.215 passed 00:06:53.215 Test: blockdev copy ...passed 00:06:53.215 Suite: bdevio tests on: Nvme2n3 00:06:53.215 Test: blockdev write read block ...passed 00:06:53.215 Test: blockdev write zeroes read block ...passed 00:06:53.215 Test: blockdev write zeroes read no split ...passed 00:06:53.215 Test: blockdev write zeroes read split ...passed 00:06:53.215 Test: blockdev write zeroes read split partial ...passed 00:06:53.215 Test: blockdev reset ...[2024-12-15 04:57:13.138816] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:12.0, 0] resetting controller 00:06:53.215 passed 00:06:53.215 Test: blockdev write read 8 blocks ...[2024-12-15 04:57:13.141165] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:12.0, 0] Resetting controller successful. 
00:06:53.215 passed 00:06:53.215 Test: blockdev write read size > 128k ...passed 00:06:53.215 Test: blockdev write read invalid size ...passed 00:06:53.215 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:53.215 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:53.215 Test: blockdev write read max offset ...passed 00:06:53.215 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:53.215 Test: blockdev writev readv 8 blocks ...passed 00:06:53.215 Test: blockdev writev readv 30 x 1block ...passed 00:06:53.215 Test: blockdev writev readv block ...passed 00:06:53.215 Test: blockdev writev readv size > 128k ...passed 00:06:53.215 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:53.215 Test: blockdev comparev and writev ...[2024-12-15 04:57:13.153442] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:3 lba:0 len:1 SGL DATA BLOCK ADDRESS 0x2c8206000 len:0x1000 00:06:53.215 [2024-12-15 04:57:13.153497] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:53.215 passed 00:06:53.215 Test: blockdev nvme passthru rw ...passed 00:06:53.215 Test: blockdev nvme passthru vendor specific ...[2024-12-15 04:57:13.155858] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC RESERVED / VENDOR SPECIFIC qid:1 cid:190 PRP1 0x0 PRP2 0x0 00:06:53.215 passed 00:06:53.215 Test: blockdev nvme admin passthru ...[2024-12-15 04:57:13.156012] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:190 cdw0:0 sqhd:001c p:1 m:0 dnr:1 00:06:53.215 passed 00:06:53.215 Test: blockdev copy ...passed 00:06:53.215 Suite: bdevio tests on: Nvme2n2 00:06:53.215 Test: blockdev write read block ...passed 00:06:53.215 Test: blockdev write zeroes read block ...passed 00:06:53.215 Test: blockdev write zeroes read no split ...passed 00:06:53.215 Test: blockdev write zeroes read split ...passed 00:06:53.215 Test: blockdev write zeroes read split partial ...passed 00:06:53.215 Test: blockdev reset ...[2024-12-15 04:57:13.177376] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:12.0, 0] resetting controller 00:06:53.215 [2024-12-15 04:57:13.180333] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:12.0, 0] Resetting controller successful.
00:06:53.215 passed 00:06:53.215 Test: blockdev write read 8 blocks ...passed 00:06:53.215 Test: blockdev write read size > 128k ...passed 00:06:53.215 Test: blockdev write read invalid size ...passed 00:06:53.215 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:53.215 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:53.215 Test: blockdev write read max offset ...passed 00:06:53.215 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:53.215 Test: blockdev writev readv 8 blocks ...passed 00:06:53.215 Test: blockdev writev readv 30 x 1block ...passed 00:06:53.215 Test: blockdev writev readv block ...passed 00:06:53.215 Test: blockdev writev readv size > 128k ...passed 00:06:53.215 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:53.215 Test: blockdev comparev and writev ...[2024-12-15 04:57:13.196239] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:2 lba:0 len:1 SGL DATA BLOCK ADDRESS 0x2c8208000 len:0x1000 00:06:53.215 [2024-12-15 04:57:13.196286] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:53.215 passed 00:06:53.215 Test: blockdev nvme passthru rw ...passed 00:06:53.215 Test: blockdev nvme passthru vendor specific ...[2024-12-15 04:57:13.198527] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC RESERVED / VENDOR SPECIFIC qid:1 cid:190 PRP1 0x0 PRP2 0x0 00:06:53.215 passed 00:06:53.215 Test: blockdev nvme admin passthru ...[2024-12-15 04:57:13.198631] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:190 cdw0:0 sqhd:001c p:1 m:0 dnr:1 00:06:53.215 passed 00:06:53.215 Test: blockdev copy ...passed 00:06:53.215 Suite: bdevio tests on: Nvme2n1 00:06:53.215 Test: blockdev write read block ...passed 00:06:53.215 Test: blockdev write zeroes read block ...passed 00:06:53.215 Test: blockdev write zeroes read no split ...passed 00:06:53.215 Test: blockdev write zeroes read split ...passed 00:06:53.215 Test: blockdev write zeroes read split partial ...passed 00:06:53.215 Test: blockdev reset ...[2024-12-15 04:57:13.226771] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:12.0, 0] resetting controller 00:06:53.215 passed 00:06:53.215 Test: blockdev write read 8 blocks ...[2024-12-15 04:57:13.228977] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:12.0, 0] Resetting controller successful.
00:06:53.215 passed 00:06:53.215 Test: blockdev write read size > 128k ...passed 00:06:53.215 Test: blockdev write read invalid size ...passed 00:06:53.215 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:53.215 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:53.215 Test: blockdev write read max offset ...passed 00:06:53.215 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:53.215 Test: blockdev writev readv 8 blocks ...passed 00:06:53.215 Test: blockdev writev readv 30 x 1block ...passed 00:06:53.215 Test: blockdev writev readv block ...passed 00:06:53.215 Test: blockdev writev readv size > 128k ...passed 00:06:53.215 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:53.215 Test: blockdev comparev and writev ...[2024-12-15 04:57:13.242654] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:1 lba:0 len:1 SGL DATA BLOCK ADDRESS 0x2e8a3d000 len:0x1000 00:06:53.215 [2024-12-15 04:57:13.242700] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:53.215 passed 00:06:53.215 Test: blockdev nvme passthru rw ...passed 00:06:53.215 Test: blockdev nvme passthru vendor specific ...passed 00:06:53.215 Test: blockdev nvme admin passthru ...[2024-12-15 04:57:13.244579] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:190 PRP1 0x0 PRP2 0x0 00:06:53.215 [2024-12-15 04:57:13.244612] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:190 cdw0:0 sqhd:001c p:1 m:0 dnr:1 00:06:53.215 passed 00:06:53.215 Test: blockdev copy ...passed 00:06:53.215 Suite: bdevio tests on: Nvme1n1p2 00:06:53.215 Test: blockdev write read block ...passed 00:06:53.215 Test: blockdev write zeroes read block ...passed 00:06:53.215 Test: blockdev write zeroes read no split ...passed 00:06:53.215 Test: blockdev write zeroes read split ...passed 00:06:53.215 Test: blockdev write zeroes read split partial ...passed 00:06:53.215 Test: blockdev reset ...[2024-12-15 04:57:13.265153] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:11.0, 0] resetting controller 00:06:53.215 passed 00:06:53.215 Test: blockdev write read 8 blocks ...[2024-12-15 04:57:13.267182] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:11.0, 0] Resetting controller successful. 
00:06:53.215 passed 00:06:53.215 Test: blockdev write read size > 128k ...passed 00:06:53.215 Test: blockdev write read invalid size ...passed 00:06:53.215 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:53.215 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:53.215 Test: blockdev write read max offset ...passed 00:06:53.215 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:53.215 Test: blockdev writev readv 8 blocks ...passed 00:06:53.215 Test: blockdev writev readv 30 x 1block ...passed 00:06:53.215 Test: blockdev writev readv block ...passed 00:06:53.215 Test: blockdev writev readv size > 128k ...passed 00:06:53.215 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:53.215 Test: blockdev comparev and writev ...[2024-12-15 04:57:13.281932] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:1 lba:655360 len:1 SGL DATA BLOCK ADDRESS 0x2e8a39000 len:0x1000 00:06:53.215 [2024-12-15 04:57:13.282056] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:53.215 passed 00:06:53.215 Test: blockdev nvme passthru rw ...passed 00:06:53.215 Test: blockdev nvme passthru vendor specific ...passed 00:06:53.215 Test: blockdev nvme admin passthru ...passed 00:06:53.215 Test: blockdev copy ...passed 00:06:53.215 Suite: bdevio tests on: Nvme1n1p1 00:06:53.215 Test: blockdev write read block ...passed 00:06:53.215 Test: blockdev write zeroes read block ...passed 00:06:53.216 Test: blockdev write zeroes read no split ...passed 00:06:53.216 Test: blockdev write zeroes read split ...passed 00:06:53.216 Test: blockdev write zeroes read split partial ...passed 00:06:53.216 Test: blockdev reset ...[2024-12-15 04:57:13.300876] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:11.0, 0] resetting controller 00:06:53.216 passed 00:06:53.216 Test: blockdev write read 8 blocks ...[2024-12-15 04:57:13.303271] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:11.0, 0] Resetting controller successful. 
00:06:53.216 passed 00:06:53.216 Test: blockdev write read size > 128k ...passed 00:06:53.216 Test: blockdev write read invalid size ...passed 00:06:53.216 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:53.216 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:53.216 Test: blockdev write read max offset ...passed 00:06:53.216 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:53.216 Test: blockdev writev readv 8 blocks ...passed 00:06:53.216 Test: blockdev writev readv 30 x 1block ...passed 00:06:53.216 Test: blockdev writev readv block ...passed 00:06:53.216 Test: blockdev writev readv size > 128k ...passed 00:06:53.216 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:53.216 Test: blockdev comparev and writev ...[2024-12-15 04:57:13.317564] nvme_qpair.c: 243:nvme_io_qpair_print_command: *NOTICE*: COMPARE sqid:1 cid:190 nsid:1 lba:256 len:1 SGL DATA BLOCK ADDRESS 0x2e8a35000 len:0x1000 00:06:53.216 [2024-12-15 04:57:13.317600] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: COMPARE FAILURE (02/85) qid:1 cid:190 cdw0:0 sqhd:0018 p:1 m:0 dnr:1 00:06:53.216 passed 00:06:53.216 Test: blockdev nvme passthru rw ...passed 00:06:53.216 Test: blockdev nvme passthru vendor specific ...passed 00:06:53.216 Test: blockdev nvme admin passthru ...passed 00:06:53.216 Test: blockdev copy ...passed 00:06:53.216 Suite: bdevio tests on: Nvme0n1 00:06:53.216 Test: blockdev write read block ...passed 00:06:53.216 Test: blockdev write zeroes read block ...passed 00:06:53.216 Test: blockdev write zeroes read no split ...passed 00:06:53.216 Test: blockdev write zeroes read split ...passed 00:06:53.216 Test: blockdev write zeroes read split partial ...passed 00:06:53.216 Test: blockdev reset ...[2024-12-15 04:57:13.336053] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:10.0, 0] resetting controller 00:06:53.216 passed 00:06:53.216 Test: blockdev write read 8 blocks ...[2024-12-15 04:57:13.338506] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:10.0, 0] Resetting controller successful. 00:06:53.216 passed 00:06:53.216 Test: blockdev write read size > 128k ...passed 00:06:53.216 Test: blockdev write read invalid size ...passed 00:06:53.216 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:06:53.216 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:06:53.216 Test: blockdev write read max offset ...passed 00:06:53.216 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:06:53.216 Test: blockdev writev readv 8 blocks ...passed 00:06:53.216 Test: blockdev writev readv 30 x 1block ...passed 00:06:53.216 Test: blockdev writev readv block ...passed 00:06:53.216 Test: blockdev writev readv size > 128k ...passed 00:06:53.216 Test: blockdev writev readv size > 128k in two iovs ...passed 00:06:53.216 Test: blockdev comparev and writev ...passed 00:06:53.216 Test: blockdev nvme passthru rw ...[2024-12-15 04:57:13.350965] bdevio.c: 727:blockdev_comparev_and_writev: *ERROR*: skipping comparev_and_writev on bdev Nvme0n1 since it has 00:06:53.216 separate metadata which is not supported yet. 
00:06:53.474 passed 00:06:53.474 Test: blockdev nvme passthru vendor specific ...passed 00:06:53.474 Test: blockdev nvme admin passthru ...[2024-12-15 04:57:13.352501] nvme_qpair.c: 218:nvme_admin_qpair_print_command: *NOTICE*: FABRIC CONNECT qid:1 cid:191 PRP1 0x0 PRP2 0x0 00:06:53.474 [2024-12-15 04:57:13.352537] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:1 cid:191 cdw0:0 sqhd:0017 p:1 m:0 dnr:1 00:06:53.474 passed 00:06:53.474 Test: blockdev copy ...passed 00:06:53.474 00:06:53.474 Run Summary: Type Total Ran Passed Failed Inactive 00:06:53.474 suites 7 7 n/a 0 0 00:06:53.474 tests 161 161 161 0 0 00:06:53.474 asserts 1025 1025 1025 0 n/a 00:06:53.474 00:06:53.474 Elapsed time = 0.593 seconds 00:06:53.474 0 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 75083 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@954 -- # '[' -z 75083 ']' 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@958 -- # kill -0 75083 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@959 -- # uname 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 75083 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@972 -- # echo 'killing process with pid 75083' 00:06:53.474 killing process with pid 75083 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@973 -- # kill 75083 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@978 -- # wait 75083 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:06:53.474 00:06:53.474 real 0m1.474s 00:06:53.474 user 0m3.724s 00:06:53.474 sys 0m0.277s 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@1130 -- # xtrace_disable 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:06:53.474 ************************************ 00:06:53.474 END TEST bdev_bounds 00:06:53.474 ************************************ 00:06:53.474 04:57:13 blockdev_nvme_gpt -- bdev/blockdev.sh@799 -- # run_test bdev_nbd nbd_function_test /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 'Nvme0n1 Nvme1n1p1 Nvme1n1p2 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' '' 00:06:53.474 04:57:13 blockdev_nvme_gpt -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:06:53.474 04:57:13 blockdev_nvme_gpt -- common/autotest_common.sh@1111 -- # xtrace_disable 00:06:53.474 04:57:13 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:06:53.474 ************************************ 00:06:53.474 START TEST bdev_nbd 00:06:53.474 ************************************ 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@1129 -- # nbd_function_test /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 'Nvme0n1 Nvme1n1p1 Nvme1n1p2 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' '' 00:06:53.474 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ 
Linux == Linux ]] 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Nvme0n1' 'Nvme1n1p1' 'Nvme1n1p2' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=7 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=7 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14') 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Nvme0n1' 'Nvme1n1p1' 'Nvme1n1p2' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=75132 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 75132 /var/tmp/spdk-nbd.sock 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@835 -- # '[' -z 75132 ']' 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@840 -- # local max_retries=100 00:06:53.733 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@844 -- # xtrace_disable 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@316 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json '' 00:06:53.733 04:57:13 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:06:53.733 [2024-12-15 04:57:13.677480] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:06:53.733 [2024-12-15 04:57:13.678004] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:06:53.733 [2024-12-15 04:57:13.835360] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.733 [2024-12-15 04:57:13.854454] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@868 -- # return 0 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Nvme0n1 Nvme1n1p1 Nvme1n1p2 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Nvme0n1' 'Nvme1n1p1' 'Nvme1n1p2' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Nvme0n1 Nvme1n1p1 Nvme1n1p2 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Nvme0n1' 'Nvme1n1p1' 'Nvme1n1p2' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 7 )) 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme0n1 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- 
common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:54.668 1+0 records in 00:06:54.668 1+0 records out 00:06:54.668 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000470622 s, 8.7 MB/s 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 7 )) 00:06:54.668 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme1n1p1 00:06:54.927 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:06:54.927 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:06:54.927 04:57:14 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:06:54.927 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:06:54.927 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:54.927 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:54.927 04:57:14 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:54.927 1+0 records in 00:06:54.927 1+0 records out 00:06:54.927 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00051057 s, 8.0 MB/s 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 7 )) 00:06:54.927 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk 
Nvme1n1p2 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd2 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd2 /proc/partitions 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd2 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:55.184 1+0 records in 00:06:55.184 1+0 records out 00:06:55.184 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000530855 s, 7.7 MB/s 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 7 )) 00:06:55.184 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n1 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd3 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd3 /proc/partitions 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- 
common/autotest_common.sh@889 -- # dd if=/dev/nbd3 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:55.442 1+0 records in 00:06:55.442 1+0 records out 00:06:55.442 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000639661 s, 6.4 MB/s 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 7 )) 00:06:55.442 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n2 00:06:55.699 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:06:55.699 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:06:55.699 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:06:55.699 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd4 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd4 /proc/partitions 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd4 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:55.700 1+0 records in 00:06:55.700 1+0 records out 00:06:55.700 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000714166 s, 5.7 MB/s 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 7 )) 00:06:55.700 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk 
Nvme2n3 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd5 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd5 /proc/partitions 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd5 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:55.957 1+0 records in 00:06:55.957 1+0 records out 00:06:55.957 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00102053 s, 4.0 MB/s 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:55.957 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 7 )) 00:06:55.958 04:57:15 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme3n1 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd6 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd6 /proc/partitions 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 
-- # dd if=/dev/nbd6 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:56.216 1+0 records in 00:06:56.216 1+0 records out 00:06:56.216 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000882964 s, 4.6 MB/s 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 7 )) 00:06:56.216 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@118 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:56.473 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd0", 00:06:56.473 "bdev_name": "Nvme0n1" 00:06:56.473 }, 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd1", 00:06:56.473 "bdev_name": "Nvme1n1p1" 00:06:56.473 }, 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd2", 00:06:56.473 "bdev_name": "Nvme1n1p2" 00:06:56.473 }, 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd3", 00:06:56.473 "bdev_name": "Nvme2n1" 00:06:56.473 }, 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd4", 00:06:56.473 "bdev_name": "Nvme2n2" 00:06:56.473 }, 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd5", 00:06:56.473 "bdev_name": "Nvme2n3" 00:06:56.473 }, 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd6", 00:06:56.473 "bdev_name": "Nvme3n1" 00:06:56.473 } 00:06:56.473 ]' 00:06:56.473 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:06:56.473 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd0", 00:06:56.473 "bdev_name": "Nvme0n1" 00:06:56.473 }, 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd1", 00:06:56.473 "bdev_name": "Nvme1n1p1" 00:06:56.473 }, 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd2", 00:06:56.473 "bdev_name": "Nvme1n1p2" 00:06:56.473 }, 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd3", 00:06:56.473 "bdev_name": "Nvme2n1" 00:06:56.473 }, 00:06:56.473 { 00:06:56.473 "nbd_device": "/dev/nbd4", 00:06:56.473 "bdev_name": "Nvme2n2" 00:06:56.474 }, 00:06:56.474 { 00:06:56.474 "nbd_device": "/dev/nbd5", 00:06:56.474 "bdev_name": "Nvme2n3" 00:06:56.474 }, 00:06:56.474 { 00:06:56.474 "nbd_device": "/dev/nbd6", 00:06:56.474 "bdev_name": "Nvme3n1" 00:06:56.474 } 00:06:56.474 ]' 00:06:56.474 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:06:56.474 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6' 00:06:56.474 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:56.474 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' 
'/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6') 00:06:56.474 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:56.474 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:06:56.474 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:56.474 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:56.732 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:56.732 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:56.732 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:56.732 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:56.732 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:56.732 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:56.732 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:56.732 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:56.732 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:56.732 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:56.992 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:56.992 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:56.992 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:56.992 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:56.992 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:56.992 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:56.993 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:56.993 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:56.993 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:56.993 04:57:16 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- 
bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:57.254 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:06:57.514 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:06:57.514 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:06:57.514 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:06:57.514 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:57.514 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:57.514 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:06:57.514 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:57.514 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:57.514 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:57.514 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:06:57.775 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:06:57.775 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:06:57.775 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:06:57.775 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:57.775 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:57.775 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:06:57.775 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:57.775 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:57.775 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:57.775 04:57:17 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:06:58.036 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:06:58.036 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:06:58.036 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 
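[Editor's note: the waitfornbd_exit traces above all follow one shape: after each nbd_stop_disk RPC, poll /proc/partitions up to 20 times until the kernel drops the device node. A minimal bash sketch of that helper, reconstructed from the xtrace; the retry bound and the grep are visible above, while the per-iteration sleep is an assumption, since set -x does not show it.]

    waitfornbd_exit() {
        local nbd_name=$1
        local i
        for ((i = 1; i <= 20; i++)); do
            # Done once the kernel has removed the device from /proc/partitions.
            grep -q -w "$nbd_name" /proc/partitions || return 0
            sleep 0.1  # assumed pacing; not visible in the xtrace
        done
        return 1  # device never went away
    }

[Invoked as "waitfornbd_exit nbd6", matching the trace above.]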
00:06:58.036 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:58.036 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:58.036 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:06:58.036 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:06:58.036 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:06:58.037 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:58.037 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:58.037 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:58.297 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:58.297 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:58.297 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:58.297 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:58.297 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:58.297 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:06:58.297 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:06:58.297 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Nvme0n1 Nvme1n1p1 Nvme1n1p2 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14' 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Nvme0n1' 'Nvme1n1p1' 'Nvme1n1p2' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14') 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Nvme0n1 Nvme1n1p1 Nvme1n1p2 Nvme2n1 Nvme2n2 Nvme2n3 Nvme3n1' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14' 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Nvme0n1' 'Nvme1n1p1' 'Nvme1n1p2' 'Nvme2n1' 'Nvme2n2' 'Nvme2n3' 'Nvme3n1') 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:58.298 04:57:18 
blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14') 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 7 )) 00:06:58.298 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme0n1 /dev/nbd0 00:06:58.559 /dev/nbd0 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:58.559 1+0 records in 00:06:58.559 1+0 records out 00:06:58.559 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00118586 s, 3.5 MB/s 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 7 )) 00:06:58.559 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme1n1p1 /dev/nbd1 00:06:58.817 /dev/nbd1 00:06:58.817 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:58.817 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:58.817 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:06:58.817 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:58.817 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:58.817 04:57:18 
blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:58.817 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:06:58.817 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:58.817 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:58.817 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:58.818 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:58.818 1+0 records in 00:06:58.818 1+0 records out 00:06:58.818 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0013361 s, 3.1 MB/s 00:06:58.818 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:58.818 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:58.818 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:58.818 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:58.818 04:57:18 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:58.818 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:58.818 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 7 )) 00:06:58.818 04:57:18 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme1n1p2 /dev/nbd10 00:06:59.101 /dev/nbd10 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd10 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd10 /proc/partitions 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd10 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:59.101 1+0 records in 00:06:59.101 1+0 records out 00:06:59.101 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000670306 s, 6.1 MB/s 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:59.101 04:57:19 blockdev_nvme_gpt.bdev_nbd -- 
common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:59.102 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:59.102 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:59.102 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 7 )) 00:06:59.102 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n1 /dev/nbd11 00:06:59.366 /dev/nbd11 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd11 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd11 /proc/partitions 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd11 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:59.366 1+0 records in 00:06:59.366 1+0 records out 00:06:59.366 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000830506 s, 4.9 MB/s 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 7 )) 00:06:59.366 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n2 /dev/nbd12 00:06:59.626 /dev/nbd12 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd12 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd12 /proc/partitions 
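[Editor's note: the waitfornbd traces repeating through this region pair the same /proc/partitions poll with a single-block O_DIRECT read that must yield a non-empty scratch file, proving the device actually serves I/O. A minimal bash sketch under the same caveats: loop pacing is assumed, and /tmp/nbdtest stands in for the repo-local nbdtest scratch file used above.]

    waitfornbd() {
        local nbd_name=$1
        local i
        for ((i = 1; i <= 20; i++)); do
            # Wait until the device shows up in /proc/partitions.
            grep -q -w "$nbd_name" /proc/partitions && break
            sleep 0.1  # assumed pacing; not visible in the xtrace
        done
        # One direct-I/O read; a zero-byte result means the device is not ready.
        dd if=/dev/"$nbd_name" of=/tmp/nbdtest bs=4096 count=1 iflag=direct
        local size
        size=$(stat -c %s /tmp/nbdtest)
        rm -f /tmp/nbdtest
        [ "$size" != 0 ] && return 0
        return 1
    }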
00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd12 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:59.626 1+0 records in 00:06:59.626 1+0 records out 00:06:59.626 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00100179 s, 4.1 MB/s 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 7 )) 00:06:59.626 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme2n3 /dev/nbd13 00:06:59.885 /dev/nbd13 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd13 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd13 /proc/partitions 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd13 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:06:59.885 1+0 records in 00:06:59.885 1+0 records out 00:06:59.885 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000664479 s, 6.2 MB/s 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 7 )) 00:06:59.885 04:57:19 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Nvme3n1 /dev/nbd14 00:07:00.144 /dev/nbd14 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd14 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd14 /proc/partitions 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd14 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:00.144 1+0 records in 00:07:00.144 1+0 records out 00:07:00.144 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00150231 s, 2.7 MB/s 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 7 )) 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:00.144 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd0", 00:07:00.402 "bdev_name": "Nvme0n1" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd1", 00:07:00.402 "bdev_name": "Nvme1n1p1" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd10", 00:07:00.402 "bdev_name": "Nvme1n1p2" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd11", 00:07:00.402 "bdev_name": "Nvme2n1" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd12", 00:07:00.402 "bdev_name": "Nvme2n2" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd13", 00:07:00.402 "bdev_name": "Nvme2n3" 
00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd14", 00:07:00.402 "bdev_name": "Nvme3n1" 00:07:00.402 } 00:07:00.402 ]' 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd0", 00:07:00.402 "bdev_name": "Nvme0n1" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd1", 00:07:00.402 "bdev_name": "Nvme1n1p1" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd10", 00:07:00.402 "bdev_name": "Nvme1n1p2" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd11", 00:07:00.402 "bdev_name": "Nvme2n1" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd12", 00:07:00.402 "bdev_name": "Nvme2n2" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd13", 00:07:00.402 "bdev_name": "Nvme2n3" 00:07:00.402 }, 00:07:00.402 { 00:07:00.402 "nbd_device": "/dev/nbd14", 00:07:00.402 "bdev_name": "Nvme3n1" 00:07:00.402 } 00:07:00.402 ]' 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:00.402 /dev/nbd1 00:07:00.402 /dev/nbd10 00:07:00.402 /dev/nbd11 00:07:00.402 /dev/nbd12 00:07:00.402 /dev/nbd13 00:07:00.402 /dev/nbd14' 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:00.402 /dev/nbd1 00:07:00.402 /dev/nbd10 00:07:00.402 /dev/nbd11 00:07:00.402 /dev/nbd12 00:07:00.402 /dev/nbd13 00:07:00.402 /dev/nbd14' 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=7 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 7 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=7 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 7 -ne 7 ']' 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14' write 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14') 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:07:00.402 256+0 records in 00:07:00.402 256+0 records out 00:07:00.402 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00549823 s, 191 MB/s 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:00.402 256+0 records in 00:07:00.402 256+0 records out 00:07:00.402 1048576 bytes (1.0 MB, 1.0 MiB) copied, 
0.12853 s, 8.2 MB/s 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.402 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:00.661 256+0 records in 00:07:00.661 256+0 records out 00:07:00.661 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.129137 s, 8.1 MB/s 00:07:00.661 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.661 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:07:00.661 256+0 records in 00:07:00.661 256+0 records out 00:07:00.661 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0936132 s, 11.2 MB/s 00:07:00.661 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.661 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:07:00.661 256+0 records in 00:07:00.661 256+0 records out 00:07:00.661 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0817431 s, 12.8 MB/s 00:07:00.661 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.661 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:07:00.920 256+0 records in 00:07:00.920 256+0 records out 00:07:00.920 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0964674 s, 10.9 MB/s 00:07:00.920 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.920 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:07:00.920 256+0 records in 00:07:00.920 256+0 records out 00:07:00.920 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0801012 s, 13.1 MB/s 00:07:00.920 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:00.920 04:57:20 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:07:01.180 256+0 records in 00:07:01.180 256+0 records out 00:07:01.180 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0902406 s, 11.6 MB/s 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14' verify 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14') 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in 
"${nbd_list[@]}" 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd0 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd1 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd10 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd11 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd12 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd13 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd14 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14' 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14') 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:01.180 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:01.440 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:01.700 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:01.700 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:01.700 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:01.700 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:01.700 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:01.700 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:01.700 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:01.700 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:01.700 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:01.700 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:01.959 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:01.959 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:01.959 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:01.959 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:01.959 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:01.959 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:01.959 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:01.959 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:01.959 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:01.959 04:57:21 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:02.217 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename 
/dev/nbd12 00:07:02.217 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:02.217 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:02.217 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:02.217 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:02.217 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:02.217 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:02.217 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:02.217 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:02.217 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:02.476 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:02.476 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:02.476 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:02.476 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:02.476 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:02.476 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:02.476 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:02.476 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:02.476 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:02.476 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:02.735 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@64 -- # 
nbd_disks_name= 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock /dev/nbd0 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd=/dev/nbd0 00:07:02.994 04:57:22 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@134 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:07:02.994 malloc_lvol_verify 00:07:02.994 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@135 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:07:03.252 9fc36ec3-c892-4fa4-9466-0fd62eb31ce1 00:07:03.252 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@136 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:07:03.510 b7508cff-9401-4284-832d-bdc3c8907262 00:07:03.510 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@137 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:07:03.770 /dev/nbd0 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@139 -- # wait_for_nbd_set_capacity /dev/nbd0 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@146 -- # local nbd=nbd0 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@148 -- # [[ -e /sys/block/nbd0/size ]] 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@150 -- # (( 8192 == 0 )) 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs.ext4 /dev/nbd0 00:07:03.770 mke2fs 1.47.0 (5-Feb-2023) 00:07:03.770 Discarding device blocks: 0/4096 done 00:07:03.770 Creating filesystem with 4096 1k blocks and 1024 inodes 00:07:03.770 00:07:03.770 Allocating group tables: 0/1 done 00:07:03.770 Writing inode tables: 0/1 done 00:07:03.770 Creating journal (1024 blocks): done 00:07:03.770 Writing superblocks and filesystem accounting information: 0/1 done 00:07:03.770 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in 
"${nbd_list[@]}" 00:07:03.770 04:57:23 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 75132 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@954 -- # '[' -z 75132 ']' 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@958 -- # kill -0 75132 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@959 -- # uname 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 75132 00:07:04.030 killing process with pid 75132 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@972 -- # echo 'killing process with pid 75132' 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@973 -- # kill 75132 00:07:04.030 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@978 -- # wait 75132 00:07:04.289 04:57:24 blockdev_nvme_gpt.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:07:04.289 00:07:04.289 real 0m10.589s 00:07:04.289 user 0m15.333s 00:07:04.289 sys 0m3.710s 00:07:04.289 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:04.289 04:57:24 blockdev_nvme_gpt.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:04.289 ************************************ 00:07:04.289 END TEST bdev_nbd 00:07:04.289 ************************************ 00:07:04.289 04:57:24 blockdev_nvme_gpt -- bdev/blockdev.sh@800 -- # [[ y == y ]] 00:07:04.289 04:57:24 blockdev_nvme_gpt -- bdev/blockdev.sh@801 -- # '[' gpt = nvme ']' 00:07:04.289 04:57:24 blockdev_nvme_gpt -- bdev/blockdev.sh@801 -- # '[' gpt = gpt ']' 00:07:04.289 skipping fio tests on NVMe due to multi-ns failures. 00:07:04.289 04:57:24 blockdev_nvme_gpt -- bdev/blockdev.sh@803 -- # echo 'skipping fio tests on NVMe due to multi-ns failures.' 
00:07:04.289 04:57:24 blockdev_nvme_gpt -- bdev/blockdev.sh@812 -- # trap cleanup SIGINT SIGTERM EXIT 00:07:04.289 04:57:24 blockdev_nvme_gpt -- bdev/blockdev.sh@814 -- # run_test bdev_verify /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:07:04.289 04:57:24 blockdev_nvme_gpt -- common/autotest_common.sh@1105 -- # '[' 16 -le 1 ']' 00:07:04.289 04:57:24 blockdev_nvme_gpt -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:04.289 04:57:24 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:07:04.289 ************************************ 00:07:04.289 START TEST bdev_verify 00:07:04.289 ************************************ 00:07:04.289 04:57:24 blockdev_nvme_gpt.bdev_verify -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:07:04.289 [2024-12-15 04:57:24.295819] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:07:04.289 [2024-12-15 04:57:24.295948] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75539 ] 00:07:04.547 [2024-12-15 04:57:24.453876] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:04.547 [2024-12-15 04:57:24.472914] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:07:04.547 [2024-12-15 04:57:24.472981] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.805 Running I/O for 5 seconds... 
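The five-second verify pass now starting is a single bdevperf run; for reference, the command from the run_test line above with the common flags annotated (-q queue depth, -o I/O size in bytes, -w workload, -t duration in seconds, -m core mask; -C is passed through exactly as the harness supplies it):

    # bdevperf verify run as launched above: 128 outstanding I/Os per job,
    # 4096-byte I/Os, write-then-read-back-and-compare for 5 seconds,
    # reactors pinned to cores 0 and 1 (-m 0x3).
    /home/vagrant/spdk_repo/spdk/build/examples/bdevperf \
        --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json \
        -q 128 -o 4096 -w verify -t 5 -C -m 0x3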
00:07:07.116 19968.00 IOPS, 78.00 MiB/s
[2024-12-15T04:57:28.243Z] 20256.00 IOPS, 79.12 MiB/s
[2024-12-15T04:57:29.182Z] 20544.00 IOPS, 80.25 MiB/s
[2024-12-15T04:57:30.122Z] 20608.00 IOPS, 80.50 MiB/s
[2024-12-15T04:57:30.122Z] 20876.80 IOPS, 81.55 MiB/s
00:07:09.982 Latency(us)
[2024-12-15T04:57:30.122Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:07:09.982 Job: Nvme0n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x0 length 0xbd0bd
00:07:09.982 Nvme0n1 : 5.07 1464.47 5.72 0.00 0.00 87152.84 16636.06 96791.63
00:07:09.982 Job: Nvme0n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0xbd0bd length 0xbd0bd
00:07:09.982 Nvme0n1 : 5.06 1466.00 5.73 0.00 0.00 87076.55 17442.66 94775.14
00:07:09.982 Job: Nvme1n1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x0 length 0x4ff80
00:07:09.982 Nvme1n1p1 : 5.07 1463.75 5.72 0.00 0.00 86919.41 17442.66 82272.89
00:07:09.982 Job: Nvme1n1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x4ff80 length 0x4ff80
00:07:09.982 Nvme1n1p1 : 5.07 1465.56 5.72 0.00 0.00 86940.43 17543.48 84289.38
00:07:09.982 Job: Nvme1n1p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x0 length 0x4ff7f
00:07:09.982 Nvme1n1p2 : 5.07 1463.31 5.72 0.00 0.00 86713.83 16636.06 72190.42
00:07:09.982 Job: Nvme1n1p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x4ff7f length 0x4ff7f
00:07:09.982 Nvme1n1p2 : 5.07 1465.12 5.72 0.00 0.00 86780.92 17140.18 79046.50
00:07:09.982 Job: Nvme2n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x0 length 0x80000
00:07:09.982 Nvme2n1 : 5.07 1462.92 5.71 0.00 0.00 86532.32 16031.11 68560.74
00:07:09.982 Job: Nvme2n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x80000 length 0x80000
00:07:09.982 Nvme2n1 : 5.07 1464.71 5.72 0.00 0.00 86598.86 17140.18 69367.34
00:07:09.982 Job: Nvme2n2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x0 length 0x80000
00:07:09.982 Nvme2n2 : 5.08 1462.50 5.71 0.00 0.00 86331.36 15224.52 71383.83
00:07:09.982 Job: Nvme2n2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x80000 length 0x80000
00:07:09.982 Nvme2n2 : 5.07 1464.26 5.72 0.00 0.00 86402.88 16535.24 71383.83
00:07:09.982 Job: Nvme2n3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x0 length 0x80000
00:07:09.982 Nvme2n3 : 5.10 1481.61 5.79 0.00 0.00 85081.37 7158.55 73400.32
00:07:09.982 Job: Nvme2n3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x80000 length 0x80000
00:07:09.982 Nvme2n3 : 5.08 1474.40 5.76 0.00 0.00 85643.64 2331.57 73803.62
00:07:09.982 Job: Nvme3n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x0 length 0x20000
00:07:09.982 Nvme3n1 : 5.10 1481.22 5.79 0.00 0.00 84927.98 6604.01 74610.22
00:07:09.982 Job: Nvme3n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:07:09.982 Verification LBA range: start 0x20000 length 0x20000
00:07:09.982 Nvme3n1 : 5.09 1484.78 5.80 0.00 0.00 84864.97 4285.05 75013.51
[2024-12-15T04:57:30.122Z] ===================================================================================================================
00:07:09.982
[2024-12-15T04:57:30.122Z] Total : 20564.60 80.33 0.00 0.00 86278.12 2331.57 96791.63
00:07:10.556
00:07:10.556 real 0m6.349s
00:07:10.556 user 0m12.033s
00:07:10.556 sys 0m0.191s
00:07:10.556 04:57:30 blockdev_nvme_gpt.bdev_verify -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:10.556 ************************************
00:07:10.556 END TEST bdev_verify
00:07:10.556 ************************************
00:07:10.556 04:57:30 blockdev_nvme_gpt.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:07:10.556 04:57:30 blockdev_nvme_gpt -- bdev/blockdev.sh@815 -- # run_test bdev_verify_big_io /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:07:10.556 04:57:30 blockdev_nvme_gpt -- common/autotest_common.sh@1105 -- # '[' 16 -le 1 ']'
00:07:10.556 04:57:30 blockdev_nvme_gpt -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:10.556 04:57:30 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x
00:07:10.556 ************************************
00:07:10.556 START TEST bdev_verify_big_io
00:07:10.556 ************************************
00:07:10.556 04:57:30 blockdev_nvme_gpt.bdev_verify_big_io -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:07:10.816 [2024-12-15 04:57:30.685184] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization...
00:07:10.816 [2024-12-15 04:57:30.685291] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75627 ]
00:07:10.816 [2024-12-15 04:57:30.843426] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2
00:07:10.816 [2024-12-15 04:57:30.869323] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0
00:07:10.816 [2024-12-15 04:57:30.869379] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1
00:07:11.387 Running I/O for 5 seconds...
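The MiB/s figures in the progress lines and tables are just IOPS times I/O size: with -o 65536 the first sample below works out to 1952 x 65536 / 2^20 = 122.00 MiB/s. A throwaway check (illustrative):

    # Recompute a reported throughput figure: IOPS * I/O size in bytes -> MiB/s.
    awk -v iops=1952 -v iosz=65536 'BEGIN { printf "%.2f MiB/s\n", iops * iosz / (1024 * 1024) }'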
00:07:17.477 1952.00 IOPS, 122.00 MiB/s
[2024-12-15T04:57:38.183Z] 3305.00 IOPS, 206.56 MiB/s
[2024-12-15T04:57:38.473Z] 3512.33 IOPS, 219.52 MiB/s
00:07:18.333 Latency(us)
[2024-12-15T04:57:38.473Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:07:18.333 Job: Nvme0n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:18.333 Verification LBA range: start 0x0 length 0xbd0b
00:07:18.333 Nvme0n1 : 5.69 113.36 7.09 0.00 0.00 1066279.24 18450.90 1251838.42
00:07:18.333 Job: Nvme0n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:18.333 Verification LBA range: start 0xbd0b length 0xbd0b
00:07:18.333 Nvme0n1 : 6.09 71.90 4.49 0.00 0.00 1657558.17 16535.24 2116510.33
00:07:18.333 Job: Nvme1n1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:18.333 Verification LBA range: start 0x0 length 0x4ff8
00:07:18.333 Nvme1n1p1 : 5.77 114.10 7.13 0.00 0.00 1035782.00 104051.00 1568024.42
00:07:18.333 Job: Nvme1n1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:18.333 Verification LBA range: start 0x4ff8 length 0x4ff8
00:07:18.333 Nvme1n1p1 : 6.18 79.43 4.96 0.00 0.00 1448130.40 51218.90 1755154.90
00:07:18.333 Job: Nvme1n1p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:18.333 Verification LBA range: start 0x0 length 0x4ff7
00:07:18.333 Nvme1n1p2 : 5.77 118.74 7.42 0.00 0.00 980753.86 74610.22 1400252.26
00:07:18.333 Job: Nvme1n1p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:18.333 Verification LBA range: start 0x4ff7 length 0x4ff7
00:07:18.333 Nvme1n1p2 : 6.18 82.48 5.15 0.00 0.00 1313270.86 36498.51 1387346.71
00:07:18.333 Job: Nvme2n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:18.333 Verification LBA range: start 0x0 length 0x8000
00:07:18.333 Nvme2n1 : 5.88 129.03 8.06 0.00 0.00 873174.38 80659.69 1167952.34
00:07:18.334 Job: Nvme2n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:18.334 Verification LBA range: start 0x8000 length 0x8000
00:07:18.334 Nvme2n1 : 6.25 91.82 5.74 0.00 0.00 1114389.00 32465.53 1309913.40
00:07:18.334 Job: Nvme2n2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:18.334 Verification LBA range: start 0x0 length 0x8000
00:07:18.334 Nvme2n2 : 5.99 138.81 8.68 0.00 0.00 793318.98 34482.02 1116330.14
00:07:18.334 Job: Nvme2n2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:18.334 Verification LBA range: start 0x8000 length 0x8000
00:07:18.334 Nvme2n2 : 6.41 116.37 7.27 0.00 0.00 847049.22 17039.36 1342177.28
00:07:18.334 Job: Nvme2n3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:18.334 Verification LBA range: start 0x0 length 0x8000
00:07:18.334 Nvme2n3 : 6.05 144.39 9.02 0.00 0.00 737048.59 24702.03 1142141.24
00:07:18.334 Job: Nvme2n3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:18.334 Verification LBA range: start 0x8000 length 0x8000
00:07:18.334 Nvme2n3 : 6.67 173.73 10.86 0.00 0.00 541107.17 15022.87 1522854.99
00:07:18.334 Job: Nvme3n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:07:18.334 Verification LBA range: start 0x0 length 0x2000
00:07:18.334 Nvme3n1 : 6.13 167.65 10.48 0.00 0.00 619143.22 649.06 1167952.34
00:07:18.334 Job: Nvme3n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:07:18.334 Verification LBA range: start 0x2000 length 0x2000
00:07:18.334 Nvme3n1 : 6.89 281.81 17.61 0.00 0.00 322263.50 586.04 1400252.26
[2024-12-15T04:57:38.474Z] ===================================================================================================================
00:07:18.334
[2024-12-15T04:57:38.474Z] Total : 1823.63 113.98 0.00 0.00 817906.62 586.04 2116510.33
00:07:20.233
00:07:20.233 real 0m9.249s
00:07:20.233 user 0m17.710s
00:07:20.233 sys 0m0.272s
00:07:20.233 04:57:39 blockdev_nvme_gpt.bdev_verify_big_io -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:20.233 04:57:39 blockdev_nvme_gpt.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:07:20.233 ************************************
00:07:20.233 END TEST bdev_verify_big_io
00:07:20.233 ************************************
00:07:20.233 04:57:39 blockdev_nvme_gpt -- bdev/blockdev.sh@816 -- # run_test bdev_write_zeroes /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:07:20.233 04:57:39 blockdev_nvme_gpt -- common/autotest_common.sh@1105 -- # '[' 13 -le 1 ']'
00:07:20.233 04:57:39 blockdev_nvme_gpt -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:20.233 04:57:39 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x
00:07:20.233 ************************************
00:07:20.233 START TEST bdev_write_zeroes
00:07:20.233 ************************************
00:07:20.233 04:57:39 blockdev_nvme_gpt.bdev_write_zeroes -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:07:20.233 [2024-12-15 04:57:39.964966] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization...
00:07:20.233 [2024-12-15 04:57:39.965067] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75743 ]
00:07:20.233 [2024-12-15 04:57:40.114298] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:20.233 [2024-12-15 04:57:40.136796] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0
00:07:20.491 Running I/O for 1 seconds...
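Whether a bdev can service this workload at all is advertised per bdev: the bdev_get_bdevs dumps later in this run show "write_zeroes": true under supported_io_types. A hedged one-liner to check that up front (assuming the default spdk_tgt RPC socket, with Nvme0n1 as an example bdev name):

    # Ask the target whether a bdev advertises the write_zeroes I/O type.
    /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 \
        | jq -r '.[0].supported_io_types.write_zeroes'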
00:07:21.990 20265.00 IOPS, 79.16 MiB/s
00:07:21.990 Latency(us)
[2024-12-15T04:57:42.130Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:07:21.990 Job: Nvme0n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:21.990 Nvme0n1 : 1.44 1798.60 7.03 0.00 0.00 64424.47 6200.71 519448.42
00:07:21.990 Job: Nvme1n1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:21.990 Nvme1n1p1 : 1.27 2470.62 9.65 0.00 0.00 51680.44 11292.36 404911.66
00:07:21.990 Job: Nvme1n1p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:21.990 Nvme1n1p2 : 1.27 2379.93 9.30 0.00 0.00 53536.42 11292.36 404911.66
00:07:21.990 Job: Nvme2n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:21.990 Nvme2n1 : 1.31 2344.18 9.16 0.00 0.00 53369.38 11292.36 404911.66
00:07:21.990 Job: Nvme2n2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:21.990 Nvme2n2 : 1.31 2342.53 9.15 0.00 0.00 53342.13 11292.36 404911.66
00:07:21.990 Job: Nvme2n3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:21.990 Nvme2n3 : 1.31 2340.86 9.14 0.00 0.00 53287.68 11241.94 398458.88
00:07:21.990 Job: Nvme3n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:07:21.990 Nvme3n1 : 1.31 2436.40 9.52 0.00 0.00 51127.23 11292.36 393619.30
[2024-12-15T04:57:42.130Z] ===================================================================================================================
00:07:21.990
[2024-12-15T04:57:42.130Z] Total : 16113.12 62.94 0.00 0.00 54138.55 6200.71 519448.42
00:07:22.249
00:07:22.249 real 0m2.377s
00:07:22.249 user 0m2.074s
00:07:22.249 sys 0m0.194s
00:07:22.249 04:57:42 blockdev_nvme_gpt.bdev_write_zeroes -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:22.249 04:57:42 blockdev_nvme_gpt.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:07:22.249 ************************************
00:07:22.249 END TEST bdev_write_zeroes
00:07:22.249 ************************************
00:07:22.249 04:57:42 blockdev_nvme_gpt -- bdev/blockdev.sh@819 -- # run_test bdev_json_nonenclosed /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:07:22.249 04:57:42 blockdev_nvme_gpt -- common/autotest_common.sh@1105 -- # '[' 13 -le 1 ']'
00:07:22.249 04:57:42 blockdev_nvme_gpt -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:22.249 04:57:42 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x
00:07:22.249 ************************************
00:07:22.249 START TEST bdev_json_nonenclosed
00:07:22.249 ************************************
00:07:22.249 04:57:42 blockdev_nvme_gpt.bdev_json_nonenclosed -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:07:22.506 [2024-12-15 04:57:42.402390] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization...
00:07:22.506 [2024-12-15 04:57:42.402522] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75785 ] 00:07:22.506 [2024-12-15 04:57:42.561958] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.506 [2024-12-15 04:57:42.590283] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:07:22.506 [2024-12-15 04:57:42.590381] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:07:22.506 [2024-12-15 04:57:42.590405] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:07:22.506 [2024-12-15 04:57:42.590416] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:22.764 00:07:22.764 real 0m0.318s 00:07:22.764 user 0m0.124s 00:07:22.764 sys 0m0.091s 00:07:22.764 04:57:42 blockdev_nvme_gpt.bdev_json_nonenclosed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:22.764 ************************************ 00:07:22.764 END TEST bdev_json_nonenclosed 00:07:22.764 ************************************ 00:07:22.764 04:57:42 blockdev_nvme_gpt.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:07:22.764 04:57:42 blockdev_nvme_gpt -- bdev/blockdev.sh@822 -- # run_test bdev_json_nonarray /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:07:22.764 04:57:42 blockdev_nvme_gpt -- common/autotest_common.sh@1105 -- # '[' 13 -le 1 ']' 00:07:22.764 04:57:42 blockdev_nvme_gpt -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:22.764 04:57:42 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:07:22.764 ************************************ 00:07:22.764 START TEST bdev_json_nonarray 00:07:22.764 ************************************ 00:07:22.764 04:57:42 blockdev_nvme_gpt.bdev_json_nonarray -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:07:22.764 [2024-12-15 04:57:42.758234] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:07:22.764 [2024-12-15 04:57:42.758350] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75805 ] 00:07:23.023 [2024-12-15 04:57:42.917679] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.023 [2024-12-15 04:57:42.940706] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.023 [2024-12-15 04:57:42.940800] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:07:23.023 [2024-12-15 04:57:42.940817] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:07:23.023 [2024-12-15 04:57:42.940829] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:23.023 00:07:23.023 real 0m0.304s 00:07:23.023 user 0m0.118s 00:07:23.023 sys 0m0.084s 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_json_nonarray -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:07:23.023 ************************************ 00:07:23.023 END TEST bdev_json_nonarray 00:07:23.023 ************************************ 00:07:23.023 04:57:43 blockdev_nvme_gpt -- bdev/blockdev.sh@824 -- # [[ gpt == bdev ]] 00:07:23.023 04:57:43 blockdev_nvme_gpt -- bdev/blockdev.sh@832 -- # [[ gpt == gpt ]] 00:07:23.023 04:57:43 blockdev_nvme_gpt -- bdev/blockdev.sh@833 -- # run_test bdev_gpt_uuid bdev_gpt_uuid 00:07:23.023 04:57:43 blockdev_nvme_gpt -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:23.023 04:57:43 blockdev_nvme_gpt -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:23.023 04:57:43 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:07:23.023 ************************************ 00:07:23.023 START TEST bdev_gpt_uuid 00:07:23.023 ************************************ 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@1129 -- # bdev_gpt_uuid 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@651 -- # local bdev 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@653 -- # start_spdk_tgt 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=75836 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@49 -- # waitforlisten 75836 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@46 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt '' '' 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@835 -- # '[' -z 75836 ']' 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@840 -- # local max_retries=100 00:07:23.023 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@844 -- # xtrace_disable 00:07:23.023 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@10 -- # set +x 00:07:23.023 [2024-12-15 04:57:43.122641] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:07:23.023 [2024-12-15 04:57:43.122763] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid75836 ] 00:07:23.281 [2024-12-15 04:57:43.282794] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:23.281 [2024-12-15 04:57:43.306593] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.847 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:07:23.847 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@868 -- # return 0 00:07:23.847 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@655 -- # rpc_cmd load_config -j /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:07:23.847 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:23.847 04:57:43 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@10 -- # set +x 00:07:24.412 Some configs were skipped because the RPC state that can call them passed over. 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@656 -- # rpc_cmd bdev_wait_for_examine 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@10 -- # set +x 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_get_bdevs -b 6f89f330-603b-4116-ac73-2ca8eae53030 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@10 -- # set +x 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@658 -- # bdev='[ 00:07:24.412 { 00:07:24.412 "name": "Nvme1n1p1", 00:07:24.412 "aliases": [ 00:07:24.412 "6f89f330-603b-4116-ac73-2ca8eae53030" 00:07:24.412 ], 00:07:24.412 "product_name": "GPT Disk", 00:07:24.412 "block_size": 4096, 00:07:24.412 "num_blocks": 655104, 00:07:24.412 "uuid": "6f89f330-603b-4116-ac73-2ca8eae53030", 00:07:24.412 "assigned_rate_limits": { 00:07:24.412 "rw_ios_per_sec": 0, 00:07:24.412 "rw_mbytes_per_sec": 0, 00:07:24.412 "r_mbytes_per_sec": 0, 00:07:24.412 "w_mbytes_per_sec": 0 00:07:24.412 }, 00:07:24.412 "claimed": false, 00:07:24.412 "zoned": false, 00:07:24.412 "supported_io_types": { 00:07:24.412 "read": true, 00:07:24.412 "write": true, 00:07:24.412 "unmap": true, 00:07:24.412 "flush": true, 00:07:24.412 "reset": true, 00:07:24.412 "nvme_admin": false, 00:07:24.412 "nvme_io": false, 00:07:24.412 "nvme_io_md": false, 00:07:24.412 "write_zeroes": true, 00:07:24.412 "zcopy": false, 00:07:24.412 "get_zone_info": false, 00:07:24.412 "zone_management": false, 00:07:24.412 "zone_append": false, 00:07:24.412 "compare": true, 00:07:24.412 "compare_and_write": false, 00:07:24.412 "abort": true, 00:07:24.412 "seek_hole": false, 00:07:24.412 "seek_data": false, 00:07:24.412 "copy": true, 00:07:24.412 "nvme_iov_md": false 00:07:24.412 }, 00:07:24.412 "driver_specific": { 
00:07:24.412 "gpt": { 00:07:24.412 "base_bdev": "Nvme1n1", 00:07:24.412 "offset_blocks": 256, 00:07:24.412 "partition_type_guid": "6527994e-2c5a-4eec-9613-8f5944074e8b", 00:07:24.412 "unique_partition_guid": "6f89f330-603b-4116-ac73-2ca8eae53030", 00:07:24.412 "partition_name": "SPDK_TEST_first" 00:07:24.412 } 00:07:24.412 } 00:07:24.412 } 00:07:24.412 ]' 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@659 -- # jq -r length 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@659 -- # [[ 1 == \1 ]] 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@660 -- # jq -r '.[0].aliases[0]' 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@660 -- # [[ 6f89f330-603b-4116-ac73-2ca8eae53030 == \6\f\8\9\f\3\3\0\-\6\0\3\b\-\4\1\1\6\-\a\c\7\3\-\2\c\a\8\e\a\e\5\3\0\3\0 ]] 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@661 -- # jq -r '.[0].driver_specific.gpt.unique_partition_guid' 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@661 -- # [[ 6f89f330-603b-4116-ac73-2ca8eae53030 == \6\f\8\9\f\3\3\0\-\6\0\3\b\-\4\1\1\6\-\a\c\7\3\-\2\c\a\8\e\a\e\5\3\0\3\0 ]] 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@663 -- # rpc_cmd bdev_get_bdevs -b abf1734f-66e5-4c0f-aa29-4021d4d307df 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@563 -- # xtrace_disable 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@10 -- # set +x 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:07:24.412 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@663 -- # bdev='[ 00:07:24.412 { 00:07:24.412 "name": "Nvme1n1p2", 00:07:24.412 "aliases": [ 00:07:24.412 "abf1734f-66e5-4c0f-aa29-4021d4d307df" 00:07:24.412 ], 00:07:24.412 "product_name": "GPT Disk", 00:07:24.412 "block_size": 4096, 00:07:24.412 "num_blocks": 655103, 00:07:24.412 "uuid": "abf1734f-66e5-4c0f-aa29-4021d4d307df", 00:07:24.412 "assigned_rate_limits": { 00:07:24.412 "rw_ios_per_sec": 0, 00:07:24.412 "rw_mbytes_per_sec": 0, 00:07:24.412 "r_mbytes_per_sec": 0, 00:07:24.412 "w_mbytes_per_sec": 0 00:07:24.412 }, 00:07:24.412 "claimed": false, 00:07:24.412 "zoned": false, 00:07:24.412 "supported_io_types": { 00:07:24.412 "read": true, 00:07:24.412 "write": true, 00:07:24.412 "unmap": true, 00:07:24.412 "flush": true, 00:07:24.412 "reset": true, 00:07:24.412 "nvme_admin": false, 00:07:24.412 "nvme_io": false, 00:07:24.412 "nvme_io_md": false, 00:07:24.412 "write_zeroes": true, 00:07:24.412 "zcopy": false, 00:07:24.412 "get_zone_info": false, 00:07:24.413 "zone_management": false, 00:07:24.413 "zone_append": false, 00:07:24.413 "compare": true, 00:07:24.413 "compare_and_write": false, 00:07:24.413 "abort": true, 00:07:24.413 "seek_hole": false, 00:07:24.413 "seek_data": false, 00:07:24.413 "copy": true, 00:07:24.413 "nvme_iov_md": false 00:07:24.413 }, 00:07:24.413 "driver_specific": { 00:07:24.413 "gpt": { 00:07:24.413 "base_bdev": "Nvme1n1", 00:07:24.413 "offset_blocks": 655360, 00:07:24.413 "partition_type_guid": "7c5222bd-8f5d-4087-9c00-bf9843c7b58c", 00:07:24.413 "unique_partition_guid": "abf1734f-66e5-4c0f-aa29-4021d4d307df", 00:07:24.413 "partition_name": "SPDK_TEST_second" 00:07:24.413 } 00:07:24.413 } 00:07:24.413 } 00:07:24.413 ]' 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@664 -- # jq -r length 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid 
-- bdev/blockdev.sh@664 -- # [[ 1 == \1 ]] 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@665 -- # jq -r '.[0].aliases[0]' 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@665 -- # [[ abf1734f-66e5-4c0f-aa29-4021d4d307df == \a\b\f\1\7\3\4\f\-\6\6\e\5\-\4\c\0\f\-\a\a\2\9\-\4\0\2\1\d\4\d\3\0\7\d\f ]] 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@666 -- # jq -r '.[0].driver_specific.gpt.unique_partition_guid' 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@666 -- # [[ abf1734f-66e5-4c0f-aa29-4021d4d307df == \a\b\f\1\7\3\4\f\-\6\6\e\5\-\4\c\0\f\-\a\a\2\9\-\4\0\2\1\d\4\d\3\0\7\d\f ]] 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- bdev/blockdev.sh@668 -- # killprocess 75836 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@954 -- # '[' -z 75836 ']' 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@958 -- # kill -0 75836 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@959 -- # uname 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 75836 00:07:24.413 killing process with pid 75836 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@972 -- # echo 'killing process with pid 75836' 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@973 -- # kill 75836 00:07:24.413 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@978 -- # wait 75836 00:07:24.979 00:07:24.979 real 0m1.800s 00:07:24.979 user 0m1.922s 00:07:24.979 sys 0m0.376s 00:07:24.979 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:24.979 ************************************ 00:07:24.979 END TEST bdev_gpt_uuid 00:07:24.979 ************************************ 00:07:24.979 04:57:44 blockdev_nvme_gpt.bdev_gpt_uuid -- common/autotest_common.sh@10 -- # set +x 00:07:24.979 04:57:44 blockdev_nvme_gpt -- bdev/blockdev.sh@836 -- # [[ gpt == crypto_sw ]] 00:07:24.979 04:57:44 blockdev_nvme_gpt -- bdev/blockdev.sh@848 -- # trap - SIGINT SIGTERM EXIT 00:07:24.979 04:57:44 blockdev_nvme_gpt -- bdev/blockdev.sh@849 -- # cleanup 00:07:24.979 04:57:44 blockdev_nvme_gpt -- bdev/blockdev.sh@23 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/aiofile 00:07:24.979 04:57:44 blockdev_nvme_gpt -- bdev/blockdev.sh@24 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:07:24.979 04:57:44 blockdev_nvme_gpt -- bdev/blockdev.sh@26 -- # [[ gpt == rbd ]] 00:07:24.979 04:57:44 blockdev_nvme_gpt -- bdev/blockdev.sh@30 -- # [[ gpt == daos ]] 00:07:24.979 04:57:44 blockdev_nvme_gpt -- bdev/blockdev.sh@34 -- # [[ gpt = \g\p\t ]] 00:07:24.979 04:57:44 blockdev_nvme_gpt -- bdev/blockdev.sh@35 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:07:25.236 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:07:25.236 Waiting for block devices as requested 00:07:25.236 0000:00:11.0 (1b36 0010): uio_pci_generic -> nvme 00:07:25.493 0000:00:10.0 (1b36 0010): 
uio_pci_generic -> nvme 00:07:25.493 0000:00:12.0 (1b36 0010): uio_pci_generic -> nvme 00:07:25.493 0000:00:13.0 (1b36 0010): uio_pci_generic -> nvme 00:07:30.757 * Events for some block/disk devices (0000:00:13.0) were not caught, they may be missing 00:07:30.757 04:57:50 blockdev_nvme_gpt -- bdev/blockdev.sh@36 -- # [[ -b /dev/nvme0n1 ]] 00:07:30.757 04:57:50 blockdev_nvme_gpt -- bdev/blockdev.sh@37 -- # wipefs --all /dev/nvme0n1 00:07:31.015 /dev/nvme0n1: 8 bytes were erased at offset 0x00001000 (gpt): 45 46 49 20 50 41 52 54 00:07:31.015 /dev/nvme0n1: 8 bytes were erased at offset 0x13ffff000 (gpt): 45 46 49 20 50 41 52 54 00:07:31.015 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:07:31.015 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:07:31.015 04:57:50 blockdev_nvme_gpt -- bdev/blockdev.sh@40 -- # [[ gpt == xnvme ]] 00:07:31.015 00:07:31.015 real 0m50.916s 00:07:31.015 user 1m5.723s 00:07:31.015 sys 0m7.830s 00:07:31.015 04:57:50 blockdev_nvme_gpt -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:31.015 04:57:50 blockdev_nvme_gpt -- common/autotest_common.sh@10 -- # set +x 00:07:31.015 ************************************ 00:07:31.015 END TEST blockdev_nvme_gpt 00:07:31.015 ************************************ 00:07:31.015 04:57:50 -- spdk/autotest.sh@212 -- # run_test nvme /home/vagrant/spdk_repo/spdk/test/nvme/nvme.sh 00:07:31.015 04:57:50 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:31.015 04:57:50 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:31.015 04:57:50 -- common/autotest_common.sh@10 -- # set +x 00:07:31.016 ************************************ 00:07:31.016 START TEST nvme 00:07:31.016 ************************************ 00:07:31.016 04:57:51 nvme -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/nvme.sh 00:07:31.016 * Looking for test storage... 00:07:31.016 * Found test storage at /home/vagrant/spdk_repo/spdk/test/nvme 00:07:31.016 04:57:51 nvme -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:07:31.016 04:57:51 nvme -- common/autotest_common.sh@1711 -- # lcov --version 00:07:31.016 04:57:51 nvme -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:07:31.016 04:57:51 nvme -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:07:31.016 04:57:51 nvme -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:07:31.016 04:57:51 nvme -- scripts/common.sh@333 -- # local ver1 ver1_l 00:07:31.016 04:57:51 nvme -- scripts/common.sh@334 -- # local ver2 ver2_l 00:07:31.016 04:57:51 nvme -- scripts/common.sh@336 -- # IFS=.-: 00:07:31.016 04:57:51 nvme -- scripts/common.sh@336 -- # read -ra ver1 00:07:31.016 04:57:51 nvme -- scripts/common.sh@337 -- # IFS=.-: 00:07:31.016 04:57:51 nvme -- scripts/common.sh@337 -- # read -ra ver2 00:07:31.016 04:57:51 nvme -- scripts/common.sh@338 -- # local 'op=<' 00:07:31.016 04:57:51 nvme -- scripts/common.sh@340 -- # ver1_l=2 00:07:31.016 04:57:51 nvme -- scripts/common.sh@341 -- # ver2_l=1 00:07:31.016 04:57:51 nvme -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:07:31.016 04:57:51 nvme -- scripts/common.sh@344 -- # case "$op" in 00:07:31.016 04:57:51 nvme -- scripts/common.sh@345 -- # : 1 00:07:31.016 04:57:51 nvme -- scripts/common.sh@364 -- # (( v = 0 )) 00:07:31.016 04:57:51 nvme -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:07:31.016 04:57:51 nvme -- scripts/common.sh@365 -- # decimal 1 00:07:31.016 04:57:51 nvme -- scripts/common.sh@353 -- # local d=1 00:07:31.016 04:57:51 nvme -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:07:31.016 04:57:51 nvme -- scripts/common.sh@355 -- # echo 1 00:07:31.016 04:57:51 nvme -- scripts/common.sh@365 -- # ver1[v]=1 00:07:31.016 04:57:51 nvme -- scripts/common.sh@366 -- # decimal 2 00:07:31.016 04:57:51 nvme -- scripts/common.sh@353 -- # local d=2 00:07:31.016 04:57:51 nvme -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:07:31.016 04:57:51 nvme -- scripts/common.sh@355 -- # echo 2 00:07:31.016 04:57:51 nvme -- scripts/common.sh@366 -- # ver2[v]=2 00:07:31.016 04:57:51 nvme -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:07:31.016 04:57:51 nvme -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:07:31.016 04:57:51 nvme -- scripts/common.sh@368 -- # return 0 00:07:31.016 04:57:51 nvme -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:07:31.016 04:57:51 nvme -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:07:31.016 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.016 --rc genhtml_branch_coverage=1 00:07:31.016 --rc genhtml_function_coverage=1 00:07:31.016 --rc genhtml_legend=1 00:07:31.016 --rc geninfo_all_blocks=1 00:07:31.016 --rc geninfo_unexecuted_blocks=1 00:07:31.016 00:07:31.016 ' 00:07:31.016 04:57:51 nvme -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:07:31.016 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.016 --rc genhtml_branch_coverage=1 00:07:31.016 --rc genhtml_function_coverage=1 00:07:31.016 --rc genhtml_legend=1 00:07:31.016 --rc geninfo_all_blocks=1 00:07:31.016 --rc geninfo_unexecuted_blocks=1 00:07:31.016 00:07:31.016 ' 00:07:31.016 04:57:51 nvme -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:07:31.016 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.016 --rc genhtml_branch_coverage=1 00:07:31.016 --rc genhtml_function_coverage=1 00:07:31.016 --rc genhtml_legend=1 00:07:31.016 --rc geninfo_all_blocks=1 00:07:31.016 --rc geninfo_unexecuted_blocks=1 00:07:31.016 00:07:31.016 ' 00:07:31.016 04:57:51 nvme -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:07:31.016 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:07:31.016 --rc genhtml_branch_coverage=1 00:07:31.016 --rc genhtml_function_coverage=1 00:07:31.016 --rc genhtml_legend=1 00:07:31.016 --rc geninfo_all_blocks=1 00:07:31.016 --rc geninfo_unexecuted_blocks=1 00:07:31.016 00:07:31.016 ' 00:07:31.016 04:57:51 nvme -- nvme/nvme.sh@77 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:07:31.585 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:07:32.151 0000:00:13.0 (1b36 0010): nvme -> uio_pci_generic 00:07:32.151 0000:00:11.0 (1b36 0010): nvme -> uio_pci_generic 00:07:32.151 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:07:32.151 0000:00:12.0 (1b36 0010): nvme -> uio_pci_generic 00:07:32.151 04:57:52 nvme -- nvme/nvme.sh@79 -- # uname 00:07:32.151 04:57:52 nvme -- nvme/nvme.sh@79 -- # '[' Linux = Linux ']' 00:07:32.151 04:57:52 nvme -- nvme/nvme.sh@80 -- # trap 'kill_stub -9; exit 1' SIGINT SIGTERM EXIT 00:07:32.151 04:57:52 nvme -- nvme/nvme.sh@81 -- # start_stub '-s 4096 -i 0 -m 0xE' 00:07:32.151 04:57:52 nvme -- common/autotest_common.sh@1086 -- # _start_stub '-s 4096 -i 0 -m 0xE' 00:07:32.151 04:57:52 nvme -- 
common/autotest_common.sh@1072 -- # _randomize_va_space=2 00:07:32.151 04:57:52 nvme -- common/autotest_common.sh@1073 -- # echo 0 00:07:32.151 04:57:52 nvme -- common/autotest_common.sh@1075 -- # stubpid=76456 00:07:32.151 Waiting for stub to ready for secondary processes... 00:07:32.151 04:57:52 nvme -- common/autotest_common.sh@1076 -- # echo Waiting for stub to ready for secondary processes... 00:07:32.151 04:57:52 nvme -- common/autotest_common.sh@1077 -- # '[' -e /var/run/spdk_stub0 ']' 00:07:32.151 04:57:52 nvme -- common/autotest_common.sh@1079 -- # [[ -e /proc/76456 ]] 00:07:32.151 04:57:52 nvme -- common/autotest_common.sh@1080 -- # sleep 1s 00:07:32.151 04:57:52 nvme -- common/autotest_common.sh@1074 -- # /home/vagrant/spdk_repo/spdk/test/app/stub/stub -s 4096 -i 0 -m 0xE 00:07:32.151 [2024-12-15 04:57:52.234117] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:07:32.151 [2024-12-15 04:57:52.234234] [ DPDK EAL parameters: stub -c 0xE -m 4096 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto --proc-type=primary ] 00:07:33.087 [2024-12-15 04:57:52.975883] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:33.087 [2024-12-15 04:57:52.989101] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:07:33.087 [2024-12-15 04:57:52.989402] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 3 00:07:33.087 [2024-12-15 04:57:52.989455] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:07:33.087 [2024-12-15 04:57:52.999711] nvme_cuse.c:1408:start_cuse_thread: *NOTICE*: Successfully started cuse thread to poll for admin commands 00:07:33.087 [2024-12-15 04:57:52.999748] nvme_cuse.c:1220:nvme_cuse_start: *NOTICE*: Creating cuse device for controller 00:07:33.087 [2024-12-15 04:57:53.013161] nvme_cuse.c: 928:cuse_session_create: *NOTICE*: fuse session for device spdk/nvme0 created 00:07:33.087 [2024-12-15 04:57:53.013339] nvme_cuse.c: 928:cuse_session_create: *NOTICE*: fuse session for device spdk/nvme0n1 created 00:07:33.087 [2024-12-15 04:57:53.013832] nvme_cuse.c:1220:nvme_cuse_start: *NOTICE*: Creating cuse device for controller 00:07:33.087 [2024-12-15 04:57:53.014006] nvme_cuse.c: 928:cuse_session_create: *NOTICE*: fuse session for device spdk/nvme1 created 00:07:33.087 [2024-12-15 04:57:53.014050] nvme_cuse.c: 928:cuse_session_create: *NOTICE*: fuse session for device spdk/nvme1n1 created 00:07:33.087 [2024-12-15 04:57:53.014467] nvme_cuse.c:1220:nvme_cuse_start: *NOTICE*: Creating cuse device for controller 00:07:33.087 [2024-12-15 04:57:53.014664] nvme_cuse.c: 928:cuse_session_create: *NOTICE*: fuse session for device spdk/nvme2 created 00:07:33.087 [2024-12-15 04:57:53.014740] nvme_cuse.c: 928:cuse_session_create: *NOTICE*: fuse session for device spdk/nvme2n1 created 00:07:33.087 [2024-12-15 04:57:53.015665] nvme_cuse.c:1220:nvme_cuse_start: *NOTICE*: Creating cuse device for controller 00:07:33.087 [2024-12-15 04:57:53.015892] nvme_cuse.c: 928:cuse_session_create: *NOTICE*: fuse session for device spdk/nvme3 created 00:07:33.087 [2024-12-15 04:57:53.015991] nvme_cuse.c: 928:cuse_session_create: *NOTICE*: fuse session for device spdk/nvme3n1 created 00:07:33.087 [2024-12-15 04:57:53.016130] nvme_cuse.c: 928:cuse_session_create: *NOTICE*: fuse session for device spdk/nvme3n2 created 00:07:33.087 [2024-12-15 04:57:53.016214] nvme_cuse.c: 
928:cuse_session_create: *NOTICE*: fuse session for device spdk/nvme3n3 created 00:07:33.087 done. 00:07:33.087 04:57:53 nvme -- common/autotest_common.sh@1077 -- # '[' -e /var/run/spdk_stub0 ']' 00:07:33.087 04:57:53 nvme -- common/autotest_common.sh@1082 -- # echo done. 00:07:33.087 04:57:53 nvme -- nvme/nvme.sh@84 -- # run_test nvme_reset /home/vagrant/spdk_repo/spdk/test/nvme/reset/reset -q 64 -w write -o 4096 -t 5 00:07:33.087 04:57:53 nvme -- common/autotest_common.sh@1105 -- # '[' 10 -le 1 ']' 00:07:33.087 04:57:53 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:33.087 04:57:53 nvme -- common/autotest_common.sh@10 -- # set +x 00:07:33.087 ************************************ 00:07:33.087 START TEST nvme_reset 00:07:33.087 ************************************ 00:07:33.087 04:57:53 nvme.nvme_reset -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/reset/reset -q 64 -w write -o 4096 -t 5 00:07:33.345 Initializing NVMe Controllers 00:07:33.345 Skipping QEMU NVMe SSD at 0000:00:10.0 00:07:33.345 Skipping QEMU NVMe SSD at 0000:00:11.0 00:07:33.345 Skipping QEMU NVMe SSD at 0000:00:13.0 00:07:33.345 Skipping QEMU NVMe SSD at 0000:00:12.0 00:07:33.345 No NVMe controller found, /home/vagrant/spdk_repo/spdk/test/nvme/reset/reset exiting 00:07:33.345 00:07:33.345 real 0m0.196s 00:07:33.345 user 0m0.059s 00:07:33.345 sys 0m0.090s 00:07:33.345 04:57:53 nvme.nvme_reset -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:33.346 04:57:53 nvme.nvme_reset -- common/autotest_common.sh@10 -- # set +x 00:07:33.346 ************************************ 00:07:33.346 END TEST nvme_reset 00:07:33.346 ************************************ 00:07:33.346 04:57:53 nvme -- nvme/nvme.sh@85 -- # run_test nvme_identify nvme_identify 00:07:33.346 04:57:53 nvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:33.346 04:57:53 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:33.346 04:57:53 nvme -- common/autotest_common.sh@10 -- # set +x 00:07:33.346 ************************************ 00:07:33.346 START TEST nvme_identify 00:07:33.346 ************************************ 00:07:33.346 04:57:53 nvme.nvme_identify -- common/autotest_common.sh@1129 -- # nvme_identify 00:07:33.346 04:57:53 nvme.nvme_identify -- nvme/nvme.sh@12 -- # bdfs=() 00:07:33.346 04:57:53 nvme.nvme_identify -- nvme/nvme.sh@12 -- # local bdfs bdf 00:07:33.346 04:57:53 nvme.nvme_identify -- nvme/nvme.sh@13 -- # bdfs=($(get_nvme_bdfs)) 00:07:33.346 04:57:53 nvme.nvme_identify -- nvme/nvme.sh@13 -- # get_nvme_bdfs 00:07:33.346 04:57:53 nvme.nvme_identify -- common/autotest_common.sh@1498 -- # bdfs=() 00:07:33.346 04:57:53 nvme.nvme_identify -- common/autotest_common.sh@1498 -- # local bdfs 00:07:33.346 04:57:53 nvme.nvme_identify -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:07:33.346 04:57:53 nvme.nvme_identify -- common/autotest_common.sh@1499 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:07:33.346 04:57:53 nvme.nvme_identify -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:07:33.607 04:57:53 nvme.nvme_identify -- common/autotest_common.sh@1500 -- # (( 4 == 0 )) 00:07:33.607 04:57:53 nvme.nvme_identify -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:00:10.0 0000:00:11.0 0000:00:12.0 0000:00:13.0 00:07:33.607 04:57:53 nvme.nvme_identify -- nvme/nvme.sh@14 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -i 0 00:07:33.607 [2024-12-15 
04:57:53.661876] nvme_ctrlr.c:3641:nvme_ctrlr_remove_inactive_proc: *ERROR*: [0000:00:10.0, 0] process 76476 terminated unexpected 00:07:33.607 ===================================================== 00:07:33.607 NVMe Controller at 0000:00:10.0 [1b36:0010] 00:07:33.607 ===================================================== 00:07:33.607 Controller Capabilities/Features 00:07:33.607 ================================ 00:07:33.607 Vendor ID: 1b36 00:07:33.607 Subsystem Vendor ID: 1af4 00:07:33.607 Serial Number: 12340 00:07:33.607 Model Number: QEMU NVMe Ctrl 00:07:33.607 Firmware Version: 8.0.0 00:07:33.607 Recommended Arb Burst: 6 00:07:33.607 IEEE OUI Identifier: 00 54 52 00:07:33.607 Multi-path I/O 00:07:33.607 May have multiple subsystem ports: No 00:07:33.607 May have multiple controllers: No 00:07:33.607 Associated with SR-IOV VF: No 00:07:33.607 Max Data Transfer Size: 524288 00:07:33.607 Max Number of Namespaces: 256 00:07:33.607 Max Number of I/O Queues: 64 00:07:33.607 NVMe Specification Version (VS): 1.4 00:07:33.607 NVMe Specification Version (Identify): 1.4 00:07:33.607 Maximum Queue Entries: 2048 00:07:33.607 Contiguous Queues Required: Yes 00:07:33.607 Arbitration Mechanisms Supported 00:07:33.607 Weighted Round Robin: Not Supported 00:07:33.607 Vendor Specific: Not Supported 00:07:33.607 Reset Timeout: 7500 ms 00:07:33.607 Doorbell Stride: 4 bytes 00:07:33.607 NVM Subsystem Reset: Not Supported 00:07:33.607 Command Sets Supported 00:07:33.607 NVM Command Set: Supported 00:07:33.607 Boot Partition: Not Supported 00:07:33.607 Memory Page Size Minimum: 4096 bytes 00:07:33.607 Memory Page Size Maximum: 65536 bytes 00:07:33.607 Persistent Memory Region: Not Supported 00:07:33.607 Optional Asynchronous Events Supported 00:07:33.607 Namespace Attribute Notices: Supported 00:07:33.607 Firmware Activation Notices: Not Supported 00:07:33.607 ANA Change Notices: Not Supported 00:07:33.607 PLE Aggregate Log Change Notices: Not Supported 00:07:33.607 LBA Status Info Alert Notices: Not Supported 00:07:33.607 EGE Aggregate Log Change Notices: Not Supported 00:07:33.607 Normal NVM Subsystem Shutdown event: Not Supported 00:07:33.607 Zone Descriptor Change Notices: Not Supported 00:07:33.607 Discovery Log Change Notices: Not Supported 00:07:33.607 Controller Attributes 00:07:33.607 128-bit Host Identifier: Not Supported 00:07:33.607 Non-Operational Permissive Mode: Not Supported 00:07:33.607 NVM Sets: Not Supported 00:07:33.607 Read Recovery Levels: Not Supported 00:07:33.607 Endurance Groups: Not Supported 00:07:33.607 Predictable Latency Mode: Not Supported 00:07:33.607 Traffic Based Keep ALive: Not Supported 00:07:33.607 Namespace Granularity: Not Supported 00:07:33.607 SQ Associations: Not Supported 00:07:33.607 UUID List: Not Supported 00:07:33.607 Multi-Domain Subsystem: Not Supported 00:07:33.607 Fixed Capacity Management: Not Supported 00:07:33.607 Variable Capacity Management: Not Supported 00:07:33.607 Delete Endurance Group: Not Supported 00:07:33.607 Delete NVM Set: Not Supported 00:07:33.607 Extended LBA Formats Supported: Supported 00:07:33.607 Flexible Data Placement Supported: Not Supported 00:07:33.607 00:07:33.607 Controller Memory Buffer Support 00:07:33.607 ================================ 00:07:33.607 Supported: No 00:07:33.607 00:07:33.607 Persistent Memory Region Support 00:07:33.607 ================================ 00:07:33.607 Supported: No 00:07:33.607 00:07:33.607 Admin Command Set Attributes 00:07:33.607 ============================ 00:07:33.607 Security Send/Receive: 
Not Supported 00:07:33.607 Format NVM: Supported 00:07:33.607 Firmware Activate/Download: Not Supported 00:07:33.607 Namespace Management: Supported 00:07:33.607 Device Self-Test: Not Supported 00:07:33.607 Directives: Supported 00:07:33.607 NVMe-MI: Not Supported 00:07:33.607 Virtualization Management: Not Supported 00:07:33.607 Doorbell Buffer Config: Supported 00:07:33.607 Get LBA Status Capability: Not Supported 00:07:33.607 Command & Feature Lockdown Capability: Not Supported 00:07:33.608 Abort Command Limit: 4 00:07:33.608 Async Event Request Limit: 4 00:07:33.608 Number of Firmware Slots: N/A 00:07:33.608 Firmware Slot 1 Read-Only: N/A 00:07:33.608 Firmware Activation Without Reset: N/A 00:07:33.608 Multiple Update Detection Support: N/A 00:07:33.608 Firmware Update Granularity: No Information Provided 00:07:33.608 Per-Namespace SMART Log: Yes 00:07:33.608 Asymmetric Namespace Access Log Page: Not Supported 00:07:33.608 Subsystem NQN: nqn.2019-08.org.qemu:12340 00:07:33.608 Command Effects Log Page: Supported 00:07:33.608 Get Log Page Extended Data: Supported 00:07:33.608 Telemetry Log Pages: Not Supported 00:07:33.608 Persistent Event Log Pages: Not Supported 00:07:33.608 Supported Log Pages Log Page: May Support 00:07:33.608 Commands Supported & Effects Log Page: Not Supported 00:07:33.608 Feature Identifiers & Effects Log Page:May Support 00:07:33.608 NVMe-MI Commands & Effects Log Page: May Support 00:07:33.608 Data Area 4 for Telemetry Log: Not Supported 00:07:33.608 Error Log Page Entries Supported: 1 00:07:33.608 Keep Alive: Not Supported 00:07:33.608 00:07:33.608 NVM Command Set Attributes 00:07:33.608 ========================== 00:07:33.608 Submission Queue Entry Size 00:07:33.608 Max: 64 00:07:33.608 Min: 64 00:07:33.608 Completion Queue Entry Size 00:07:33.608 Max: 16 00:07:33.608 Min: 16 00:07:33.608 Number of Namespaces: 256 00:07:33.608 Compare Command: Supported 00:07:33.608 Write Uncorrectable Command: Not Supported 00:07:33.608 Dataset Management Command: Supported 00:07:33.608 Write Zeroes Command: Supported 00:07:33.608 Set Features Save Field: Supported 00:07:33.608 Reservations: Not Supported 00:07:33.608 Timestamp: Supported 00:07:33.608 Copy: Supported 00:07:33.608 Volatile Write Cache: Present 00:07:33.608 Atomic Write Unit (Normal): 1 00:07:33.608 Atomic Write Unit (PFail): 1 00:07:33.608 Atomic Compare & Write Unit: 1 00:07:33.608 Fused Compare & Write: Not Supported 00:07:33.608 Scatter-Gather List 00:07:33.608 SGL Command Set: Supported 00:07:33.608 SGL Keyed: Not Supported 00:07:33.608 SGL Bit Bucket Descriptor: Not Supported 00:07:33.608 SGL Metadata Pointer: Not Supported 00:07:33.608 Oversized SGL: Not Supported 00:07:33.608 SGL Metadata Address: Not Supported 00:07:33.608 SGL Offset: Not Supported 00:07:33.608 Transport SGL Data Block: Not Supported 00:07:33.608 Replay Protected Memory Block: Not Supported 00:07:33.608 00:07:33.608 Firmware Slot Information 00:07:33.608 ========================= 00:07:33.608 Active slot: 1 00:07:33.608 Slot 1 Firmware Revision: 1.0 00:07:33.608 00:07:33.608 00:07:33.608 Commands Supported and Effects 00:07:33.608 ============================== 00:07:33.608 Admin Commands 00:07:33.608 -------------- 00:07:33.608 Delete I/O Submission Queue (00h): Supported 00:07:33.608 Create I/O Submission Queue (01h): Supported 00:07:33.608 Get Log Page (02h): Supported 00:07:33.608 Delete I/O Completion Queue (04h): Supported 00:07:33.608 Create I/O Completion Queue (05h): Supported 00:07:33.608 Identify (06h): Supported 
00:07:33.608 Abort (08h): Supported 00:07:33.608 Set Features (09h): Supported 00:07:33.608 Get Features (0Ah): Supported 00:07:33.608 Asynchronous Event Request (0Ch): Supported 00:07:33.608 Namespace Attachment (15h): Supported NS-Inventory-Change 00:07:33.608 Directive Send (19h): Supported 00:07:33.608 Directive Receive (1Ah): Supported 00:07:33.608 Virtualization Management (1Ch): Supported 00:07:33.608 Doorbell Buffer Config (7Ch): Supported 00:07:33.608 Format NVM (80h): Supported LBA-Change 00:07:33.608 I/O Commands 00:07:33.608 ------------ 00:07:33.608 Flush (00h): Supported LBA-Change 00:07:33.608 Write (01h): Supported LBA-Change 00:07:33.608 Read (02h): Supported 00:07:33.608 Compare (05h): Supported 00:07:33.608 Write Zeroes (08h): Supported LBA-Change 00:07:33.608 Dataset Management (09h): Supported LBA-Change 00:07:33.608 Unknown (0Ch): Supported 00:07:33.608 Unknown (12h): Supported 00:07:33.608 Copy (19h): Supported LBA-Change 00:07:33.608 Unknown (1Dh): Supported LBA-Change 00:07:33.608 00:07:33.608 Error Log 00:07:33.608 ========= 00:07:33.608 00:07:33.608 Arbitration 00:07:33.608 =========== 00:07:33.608 Arbitration Burst: no limit 00:07:33.608 00:07:33.608 Power Management 00:07:33.608 ================ 00:07:33.608 Number of Power States: 1 00:07:33.608 Current Power State: Power State #0 00:07:33.608 Power State #0: 00:07:33.608 Max Power: 25.00 W 00:07:33.608 Non-Operational State: Operational 00:07:33.608 Entry Latency: 16 microseconds 00:07:33.608 Exit Latency: 4 microseconds 00:07:33.608 Relative Read Throughput: 0 00:07:33.608 Relative Read Latency: 0 00:07:33.608 Relative Write Throughput: 0 00:07:33.608 Relative Write Latency: 0 00:07:33.608 Idle Power: Not Reported [2024-12-15 04:57:53.663185] nvme_ctrlr.c:3641:nvme_ctrlr_remove_inactive_proc: *ERROR*: [0000:00:11.0, 0] process 76476 terminated unexpected 00:07:33.608 Active Power: Not Reported 00:07:33.608 Non-Operational Permissive Mode: Not Supported 00:07:33.608 00:07:33.608 Health Information 00:07:33.608 ================== 00:07:33.608 Critical Warnings: 00:07:33.608 Available Spare Space: OK 00:07:33.608 Temperature: OK 00:07:33.608 Device Reliability: OK 00:07:33.608 Read Only: No 00:07:33.608 Volatile Memory Backup: OK 00:07:33.608 Current Temperature: 323 Kelvin (50 Celsius) 00:07:33.608 Temperature Threshold: 343 Kelvin (70 Celsius) 00:07:33.608 Available Spare: 0% 00:07:33.608 Available Spare Threshold: 0% 00:07:33.608 Life Percentage Used: 0% 00:07:33.608 Data Units Read: 640 00:07:33.608 Data Units Written: 568 00:07:33.608 Host Read Commands: 33519 00:07:33.608 Host Write Commands: 33305 00:07:33.608 Controller Busy Time: 0 minutes 00:07:33.608 Power Cycles: 0 00:07:33.608 Power On Hours: 0 hours 00:07:33.608 Unsafe Shutdowns: 0 00:07:33.608 Unrecoverable Media Errors: 0 00:07:33.608 Lifetime Error Log Entries: 0 00:07:33.608 Warning Temperature Time: 0 minutes 00:07:33.608 Critical Temperature Time: 0 minutes 00:07:33.608 00:07:33.608 Number of Queues 00:07:33.608 ================ 00:07:33.608 Number of I/O Submission Queues: 64 00:07:33.608 Number of I/O Completion Queues: 64 00:07:33.608 00:07:33.608 ZNS Specific Controller Data 00:07:33.608 ============================ 00:07:33.608 Zone Append Size Limit: 0 00:07:33.608 00:07:33.608 00:07:33.608 Active Namespaces 00:07:33.608 ================= 00:07:33.608 Namespace ID:1 00:07:33.608 Error Recovery Timeout: Unlimited 00:07:33.608 Command Set Identifier: NVM (00h) 00:07:33.608 Deallocate: Supported
Deallocated/Unwritten Error: Supported 00:07:33.608 Deallocated Read Value: All 0x00 00:07:33.608 Deallocate in Write Zeroes: Not Supported 00:07:33.608 Deallocated Guard Field: 0xFFFF 00:07:33.608 Flush: Supported 00:07:33.608 Reservation: Not Supported 00:07:33.608 Metadata Transferred as: Separate Metadata Buffer 00:07:33.608 Namespace Sharing Capabilities: Private 00:07:33.608 Size (in LBAs): 1548666 (5GiB) 00:07:33.608 Capacity (in LBAs): 1548666 (5GiB) 00:07:33.608 Utilization (in LBAs): 1548666 (5GiB) 00:07:33.608 Thin Provisioning: Not Supported 00:07:33.608 Per-NS Atomic Units: No 00:07:33.608 Maximum Single Source Range Length: 128 00:07:33.608 Maximum Copy Length: 128 00:07:33.608 Maximum Source Range Count: 128 00:07:33.608 NGUID/EUI64 Never Reused: No 00:07:33.608 Namespace Write Protected: No 00:07:33.608 Number of LBA Formats: 8 00:07:33.608 Current LBA Format: LBA Format #07 00:07:33.608 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:33.608 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:33.608 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:33.608 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:33.608 LBA Format #04: Data Size: 4096 Metadata Size: 0 00:07:33.608 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:33.608 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:33.608 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:33.608 00:07:33.608 NVM Specific Namespace Data 00:07:33.608 =========================== 00:07:33.608 Logical Block Storage Tag Mask: 0 00:07:33.608 Protection Information Capabilities: 00:07:33.608 16b Guard Protection Information Storage Tag Support: No 00:07:33.608 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:33.608 Storage Tag Check Read Support: No 00:07:33.608 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.608 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.608 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.608 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.608 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.608 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.608 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.608 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.608 ===================================================== 00:07:33.608 NVMe Controller at 0000:00:11.0 [1b36:0010] 00:07:33.608 ===================================================== 00:07:33.608 Controller Capabilities/Features 00:07:33.609 ================================ 00:07:33.609 Vendor ID: 1b36 00:07:33.609 Subsystem Vendor ID: 1af4 00:07:33.609 Serial Number: 12341 00:07:33.609 Model Number: QEMU NVMe Ctrl 00:07:33.609 Firmware Version: 8.0.0 00:07:33.609 Recommended Arb Burst: 6 00:07:33.609 IEEE OUI Identifier: 00 54 52 00:07:33.609 Multi-path I/O 00:07:33.609 May have multiple subsystem ports: No 00:07:33.609 May have multiple controllers: No 00:07:33.609 Associated with SR-IOV VF: No 00:07:33.609 Max Data Transfer Size: 524288 00:07:33.609 Max Number of Namespaces: 256 00:07:33.609 Max Number of I/O Queues: 64 00:07:33.609 NVMe Specification Version (VS): 1.4 00:07:33.609 NVMe 
Specification Version (Identify): 1.4 00:07:33.609 Maximum Queue Entries: 2048 00:07:33.609 Contiguous Queues Required: Yes 00:07:33.609 Arbitration Mechanisms Supported 00:07:33.609 Weighted Round Robin: Not Supported 00:07:33.609 Vendor Specific: Not Supported 00:07:33.609 Reset Timeout: 7500 ms 00:07:33.609 Doorbell Stride: 4 bytes 00:07:33.609 NVM Subsystem Reset: Not Supported 00:07:33.609 Command Sets Supported 00:07:33.609 NVM Command Set: Supported 00:07:33.609 Boot Partition: Not Supported 00:07:33.609 Memory Page Size Minimum: 4096 bytes 00:07:33.609 Memory Page Size Maximum: 65536 bytes 00:07:33.609 Persistent Memory Region: Not Supported 00:07:33.609 Optional Asynchronous Events Supported 00:07:33.609 Namespace Attribute Notices: Supported 00:07:33.609 Firmware Activation Notices: Not Supported 00:07:33.609 ANA Change Notices: Not Supported 00:07:33.609 PLE Aggregate Log Change Notices: Not Supported 00:07:33.609 LBA Status Info Alert Notices: Not Supported 00:07:33.609 EGE Aggregate Log Change Notices: Not Supported 00:07:33.609 Normal NVM Subsystem Shutdown event: Not Supported 00:07:33.609 Zone Descriptor Change Notices: Not Supported 00:07:33.609 Discovery Log Change Notices: Not Supported 00:07:33.609 Controller Attributes 00:07:33.609 128-bit Host Identifier: Not Supported 00:07:33.609 Non-Operational Permissive Mode: Not Supported 00:07:33.609 NVM Sets: Not Supported 00:07:33.609 Read Recovery Levels: Not Supported 00:07:33.609 Endurance Groups: Not Supported 00:07:33.609 Predictable Latency Mode: Not Supported 00:07:33.609 Traffic Based Keep ALive: Not Supported 00:07:33.609 Namespace Granularity: Not Supported 00:07:33.609 SQ Associations: Not Supported 00:07:33.609 UUID List: Not Supported 00:07:33.609 Multi-Domain Subsystem: Not Supported 00:07:33.609 Fixed Capacity Management: Not Supported 00:07:33.609 Variable Capacity Management: Not Supported 00:07:33.609 Delete Endurance Group: Not Supported 00:07:33.609 Delete NVM Set: Not Supported 00:07:33.609 Extended LBA Formats Supported: Supported 00:07:33.609 Flexible Data Placement Supported: Not Supported 00:07:33.609 00:07:33.609 Controller Memory Buffer Support 00:07:33.609 ================================ 00:07:33.609 Supported: No 00:07:33.609 00:07:33.609 Persistent Memory Region Support 00:07:33.609 ================================ 00:07:33.609 Supported: No 00:07:33.609 00:07:33.609 Admin Command Set Attributes 00:07:33.609 ============================ 00:07:33.609 Security Send/Receive: Not Supported 00:07:33.609 Format NVM: Supported 00:07:33.609 Firmware Activate/Download: Not Supported 00:07:33.609 Namespace Management: Supported 00:07:33.609 Device Self-Test: Not Supported 00:07:33.609 Directives: Supported 00:07:33.609 NVMe-MI: Not Supported 00:07:33.609 Virtualization Management: Not Supported 00:07:33.609 Doorbell Buffer Config: Supported 00:07:33.609 Get LBA Status Capability: Not Supported 00:07:33.609 Command & Feature Lockdown Capability: Not Supported 00:07:33.609 Abort Command Limit: 4 00:07:33.609 Async Event Request Limit: 4 00:07:33.609 Number of Firmware Slots: N/A 00:07:33.609 Firmware Slot 1 Read-Only: N/A 00:07:33.609 Firmware Activation Without Reset: N/A 00:07:33.609 Multiple Update Detection Support: N/A 00:07:33.609 Firmware Update Granularity: No Information Provided 00:07:33.609 Per-Namespace SMART Log: Yes 00:07:33.609 Asymmetric Namespace Access Log Page: Not Supported 00:07:33.609 Subsystem NQN: nqn.2019-08.org.qemu:12341 00:07:33.609 Command Effects Log Page: Supported 
00:07:33.609 Get Log Page Extended Data: Supported 00:07:33.609 Telemetry Log Pages: Not Supported 00:07:33.609 Persistent Event Log Pages: Not Supported 00:07:33.609 Supported Log Pages Log Page: May Support 00:07:33.609 Commands Supported & Effects Log Page: Not Supported 00:07:33.609 Feature Identifiers & Effects Log Page:May Support 00:07:33.609 NVMe-MI Commands & Effects Log Page: May Support 00:07:33.609 Data Area 4 for Telemetry Log: Not Supported 00:07:33.609 Error Log Page Entries Supported: 1 00:07:33.609 Keep Alive: Not Supported 00:07:33.609 00:07:33.609 NVM Command Set Attributes 00:07:33.609 ========================== 00:07:33.609 Submission Queue Entry Size 00:07:33.609 Max: 64 00:07:33.609 Min: 64 00:07:33.609 Completion Queue Entry Size 00:07:33.609 Max: 16 00:07:33.609 Min: 16 00:07:33.609 Number of Namespaces: 256 00:07:33.609 Compare Command: Supported 00:07:33.609 Write Uncorrectable Command: Not Supported 00:07:33.609 Dataset Management Command: Supported 00:07:33.609 Write Zeroes Command: Supported 00:07:33.609 Set Features Save Field: Supported 00:07:33.609 Reservations: Not Supported 00:07:33.609 Timestamp: Supported 00:07:33.609 Copy: Supported 00:07:33.609 Volatile Write Cache: Present 00:07:33.609 Atomic Write Unit (Normal): 1 00:07:33.609 Atomic Write Unit (PFail): 1 00:07:33.609 Atomic Compare & Write Unit: 1 00:07:33.609 Fused Compare & Write: Not Supported 00:07:33.609 Scatter-Gather List 00:07:33.609 SGL Command Set: Supported 00:07:33.609 SGL Keyed: Not Supported 00:07:33.609 SGL Bit Bucket Descriptor: Not Supported 00:07:33.609 SGL Metadata Pointer: Not Supported 00:07:33.609 Oversized SGL: Not Supported 00:07:33.609 SGL Metadata Address: Not Supported 00:07:33.609 SGL Offset: Not Supported 00:07:33.609 Transport SGL Data Block: Not Supported 00:07:33.609 Replay Protected Memory Block: Not Supported 00:07:33.609 00:07:33.609 Firmware Slot Information 00:07:33.609 ========================= 00:07:33.609 Active slot: 1 00:07:33.609 Slot 1 Firmware Revision: 1.0 00:07:33.609 00:07:33.609 00:07:33.609 Commands Supported and Effects 00:07:33.609 ============================== 00:07:33.609 Admin Commands 00:07:33.609 -------------- 00:07:33.609 Delete I/O Submission Queue (00h): Supported 00:07:33.609 Create I/O Submission Queue (01h): Supported 00:07:33.609 Get Log Page (02h): Supported 00:07:33.609 Delete I/O Completion Queue (04h): Supported 00:07:33.609 Create I/O Completion Queue (05h): Supported 00:07:33.609 Identify (06h): Supported 00:07:33.609 Abort (08h): Supported 00:07:33.609 Set Features (09h): Supported 00:07:33.609 Get Features (0Ah): Supported 00:07:33.609 Asynchronous Event Request (0Ch): Supported 00:07:33.609 Namespace Attachment (15h): Supported NS-Inventory-Change 00:07:33.609 Directive Send (19h): Supported 00:07:33.609 Directive Receive (1Ah): Supported 00:07:33.609 Virtualization Management (1Ch): Supported 00:07:33.609 Doorbell Buffer Config (7Ch): Supported 00:07:33.609 Format NVM (80h): Supported LBA-Change 00:07:33.609 I/O Commands 00:07:33.609 ------------ 00:07:33.609 Flush (00h): Supported LBA-Change 00:07:33.609 Write (01h): Supported LBA-Change 00:07:33.609 Read (02h): Supported 00:07:33.609 Compare (05h): Supported 00:07:33.609 Write Zeroes (08h): Supported LBA-Change 00:07:33.609 Dataset Management (09h): Supported LBA-Change 00:07:33.609 Unknown (0Ch): Supported 00:07:33.609 Unknown (12h): Supported 00:07:33.609 Copy (19h): Supported LBA-Change 00:07:33.609 Unknown (1Dh): Supported LBA-Change 00:07:33.609 00:07:33.609 Error 
Log 00:07:33.609 ========= 00:07:33.609 00:07:33.609 Arbitration 00:07:33.609 =========== 00:07:33.609 Arbitration Burst: no limit 00:07:33.609 00:07:33.609 Power Management 00:07:33.609 ================ 00:07:33.609 Number of Power States: 1 00:07:33.609 Current Power State: Power State #0 00:07:33.609 Power State #0: 00:07:33.609 Max Power: 25.00 W 00:07:33.609 Non-Operational State: Operational 00:07:33.609 Entry Latency: 16 microseconds 00:07:33.609 Exit Latency: 4 microseconds 00:07:33.609 Relative Read Throughput: 0 00:07:33.609 Relative Read Latency: 0 00:07:33.609 Relative Write Throughput: 0 00:07:33.609 Relative Write Latency: 0 00:07:33.609 Idle Power: Not Reported 00:07:33.609 Active Power: Not Reported 00:07:33.609 Non-Operational Permissive Mode: Not Supported 00:07:33.609 00:07:33.609 Health Information 00:07:33.609 ================== 00:07:33.609 Critical Warnings: 00:07:33.609 Available Spare Space: OK 00:07:33.609 Temperature: OK [2024-12-15 04:57:53.664069] nvme_ctrlr.c:3641:nvme_ctrlr_remove_inactive_proc: *ERROR*: [0000:00:13.0, 0] process 76476 terminated unexpected 00:07:33.609 Device Reliability: OK 00:07:33.609 Read Only: No 00:07:33.609 Volatile Memory Backup: OK 00:07:33.610 Current Temperature: 323 Kelvin (50 Celsius) 00:07:33.610 Temperature Threshold: 343 Kelvin (70 Celsius) 00:07:33.610 Available Spare: 0% 00:07:33.610 Available Spare Threshold: 0% 00:07:33.610 Life Percentage Used: 0% 00:07:33.610 Data Units Read: 981 00:07:33.610 Data Units Written: 853 00:07:33.610 Host Read Commands: 50667 00:07:33.610 Host Write Commands: 49511 00:07:33.610 Controller Busy Time: 0 minutes 00:07:33.610 Power Cycles: 0 00:07:33.610 Power On Hours: 0 hours 00:07:33.610 Unsafe Shutdowns: 0 00:07:33.610 Unrecoverable Media Errors: 0 00:07:33.610 Lifetime Error Log Entries: 0 00:07:33.610 Warning Temperature Time: 0 minutes 00:07:33.610 Critical Temperature Time: 0 minutes 00:07:33.610 00:07:33.610 Number of Queues 00:07:33.610 ================ 00:07:33.610 Number of I/O Submission Queues: 64 00:07:33.610 Number of I/O Completion Queues: 64 00:07:33.610 00:07:33.610 ZNS Specific Controller Data 00:07:33.610 ============================ 00:07:33.610 Zone Append Size Limit: 0 00:07:33.610 00:07:33.610 00:07:33.610 Active Namespaces 00:07:33.610 ================= 00:07:33.610 Namespace ID:1 00:07:33.610 Error Recovery Timeout: Unlimited 00:07:33.610 Command Set Identifier: NVM (00h) 00:07:33.610 Deallocate: Supported 00:07:33.610 Deallocated/Unwritten Error: Supported 00:07:33.610 Deallocated Read Value: All 0x00 00:07:33.610 Deallocate in Write Zeroes: Not Supported 00:07:33.610 Deallocated Guard Field: 0xFFFF 00:07:33.610 Flush: Supported 00:07:33.610 Reservation: Not Supported 00:07:33.610 Namespace Sharing Capabilities: Private 00:07:33.610 Size (in LBAs): 1310720 (5GiB) 00:07:33.610 Capacity (in LBAs): 1310720 (5GiB) 00:07:33.610 Utilization (in LBAs): 1310720 (5GiB) 00:07:33.610 Thin Provisioning: Not Supported 00:07:33.610 Per-NS Atomic Units: No 00:07:33.610 Maximum Single Source Range Length: 128 00:07:33.610 Maximum Copy Length: 128 00:07:33.610 Maximum Source Range Count: 128 00:07:33.610 NGUID/EUI64 Never Reused: No 00:07:33.610 Namespace Write Protected: No 00:07:33.610 Number of LBA Formats: 8 00:07:33.610 Current LBA Format: LBA Format #04 00:07:33.610 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:33.610 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:33.610 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:33.610 LBA Format #03:
Data Size: 512 Metadata Size: 64 00:07:33.610 LBA Format #04: Data Size: 4096 Metadata Size: 0 00:07:33.610 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:33.610 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:33.610 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:33.610 00:07:33.610 NVM Specific Namespace Data 00:07:33.610 =========================== 00:07:33.610 Logical Block Storage Tag Mask: 0 00:07:33.610 Protection Information Capabilities: 00:07:33.610 16b Guard Protection Information Storage Tag Support: No 00:07:33.610 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:33.610 Storage Tag Check Read Support: No 00:07:33.610 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.610 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.610 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.610 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.610 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.610 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.610 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.610 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.610 ===================================================== 00:07:33.610 NVMe Controller at 0000:00:13.0 [1b36:0010] 00:07:33.610 ===================================================== 00:07:33.610 Controller Capabilities/Features 00:07:33.610 ================================ 00:07:33.610 Vendor ID: 1b36 00:07:33.610 Subsystem Vendor ID: 1af4 00:07:33.610 Serial Number: 12343 00:07:33.610 Model Number: QEMU NVMe Ctrl 00:07:33.610 Firmware Version: 8.0.0 00:07:33.610 Recommended Arb Burst: 6 00:07:33.610 IEEE OUI Identifier: 00 54 52 00:07:33.610 Multi-path I/O 00:07:33.610 May have multiple subsystem ports: No 00:07:33.610 May have multiple controllers: Yes 00:07:33.610 Associated with SR-IOV VF: No 00:07:33.610 Max Data Transfer Size: 524288 00:07:33.610 Max Number of Namespaces: 256 00:07:33.610 Max Number of I/O Queues: 64 00:07:33.610 NVMe Specification Version (VS): 1.4 00:07:33.610 NVMe Specification Version (Identify): 1.4 00:07:33.610 Maximum Queue Entries: 2048 00:07:33.610 Contiguous Queues Required: Yes 00:07:33.610 Arbitration Mechanisms Supported 00:07:33.610 Weighted Round Robin: Not Supported 00:07:33.610 Vendor Specific: Not Supported 00:07:33.610 Reset Timeout: 7500 ms 00:07:33.610 Doorbell Stride: 4 bytes 00:07:33.610 NVM Subsystem Reset: Not Supported 00:07:33.610 Command Sets Supported 00:07:33.610 NVM Command Set: Supported 00:07:33.610 Boot Partition: Not Supported 00:07:33.610 Memory Page Size Minimum: 4096 bytes 00:07:33.610 Memory Page Size Maximum: 65536 bytes 00:07:33.610 Persistent Memory Region: Not Supported 00:07:33.610 Optional Asynchronous Events Supported 00:07:33.610 Namespace Attribute Notices: Supported 00:07:33.610 Firmware Activation Notices: Not Supported 00:07:33.610 ANA Change Notices: Not Supported 00:07:33.610 PLE Aggregate Log Change Notices: Not Supported 00:07:33.610 LBA Status Info Alert Notices: Not Supported 00:07:33.610 EGE Aggregate Log Change Notices: Not Supported 00:07:33.610 Normal NVM Subsystem Shutdown event: Not Supported 00:07:33.610 Zone 
Descriptor Change Notices: Not Supported 00:07:33.610 Discovery Log Change Notices: Not Supported 00:07:33.610 Controller Attributes 00:07:33.610 128-bit Host Identifier: Not Supported 00:07:33.610 Non-Operational Permissive Mode: Not Supported 00:07:33.610 NVM Sets: Not Supported 00:07:33.610 Read Recovery Levels: Not Supported 00:07:33.610 Endurance Groups: Supported 00:07:33.610 Predictable Latency Mode: Not Supported 00:07:33.610 Traffic Based Keep ALive: Not Supported 00:07:33.610 Namespace Granularity: Not Supported 00:07:33.610 SQ Associations: Not Supported 00:07:33.610 UUID List: Not Supported 00:07:33.610 Multi-Domain Subsystem: Not Supported 00:07:33.610 Fixed Capacity Management: Not Supported 00:07:33.610 Variable Capacity Management: Not Supported 00:07:33.610 Delete Endurance Group: Not Supported 00:07:33.610 Delete NVM Set: Not Supported 00:07:33.610 Extended LBA Formats Supported: Supported 00:07:33.610 Flexible Data Placement Supported: Supported 00:07:33.610 00:07:33.610 Controller Memory Buffer Support 00:07:33.610 ================================ 00:07:33.610 Supported: No 00:07:33.610 00:07:33.610 Persistent Memory Region Support 00:07:33.610 ================================ 00:07:33.610 Supported: No 00:07:33.610 00:07:33.610 Admin Command Set Attributes 00:07:33.610 ============================ 00:07:33.610 Security Send/Receive: Not Supported 00:07:33.610 Format NVM: Supported 00:07:33.610 Firmware Activate/Download: Not Supported 00:07:33.610 Namespace Management: Supported 00:07:33.610 Device Self-Test: Not Supported 00:07:33.610 Directives: Supported 00:07:33.610 NVMe-MI: Not Supported 00:07:33.610 Virtualization Management: Not Supported 00:07:33.610 Doorbell Buffer Config: Supported 00:07:33.610 Get LBA Status Capability: Not Supported 00:07:33.610 Command & Feature Lockdown Capability: Not Supported 00:07:33.610 Abort Command Limit: 4 00:07:33.610 Async Event Request Limit: 4 00:07:33.610 Number of Firmware Slots: N/A 00:07:33.610 Firmware Slot 1 Read-Only: N/A 00:07:33.610 Firmware Activation Without Reset: N/A 00:07:33.610 Multiple Update Detection Support: N/A 00:07:33.610 Firmware Update Granularity: No Information Provided 00:07:33.610 Per-Namespace SMART Log: Yes 00:07:33.610 Asymmetric Namespace Access Log Page: Not Supported 00:07:33.610 Subsystem NQN: nqn.2019-08.org.qemu:fdp-subsys3 00:07:33.610 Command Effects Log Page: Supported 00:07:33.610 Get Log Page Extended Data: Supported 00:07:33.610 Telemetry Log Pages: Not Supported 00:07:33.610 Persistent Event Log Pages: Not Supported 00:07:33.610 Supported Log Pages Log Page: May Support 00:07:33.610 Commands Supported & Effects Log Page: Not Supported 00:07:33.610 Feature Identifiers & Effects Log Page:May Support 00:07:33.610 NVMe-MI Commands & Effects Log Page: May Support 00:07:33.610 Data Area 4 for Telemetry Log: Not Supported 00:07:33.610 Error Log Page Entries Supported: 1 00:07:33.610 Keep Alive: Not Supported 00:07:33.610 00:07:33.610 NVM Command Set Attributes 00:07:33.610 ========================== 00:07:33.610 Submission Queue Entry Size 00:07:33.610 Max: 64 00:07:33.610 Min: 64 00:07:33.610 Completion Queue Entry Size 00:07:33.610 Max: 16 00:07:33.610 Min: 16 00:07:33.610 Number of Namespaces: 256 00:07:33.610 Compare Command: Supported 00:07:33.610 Write Uncorrectable Command: Not Supported 00:07:33.610 Dataset Management Command: Supported 00:07:33.610 Write Zeroes Command: Supported 00:07:33.610 Set Features Save Field: Supported 00:07:33.610 Reservations: Not Supported 00:07:33.610 
Timestamp: Supported 00:07:33.610 Copy: Supported 00:07:33.610 Volatile Write Cache: Present 00:07:33.610 Atomic Write Unit (Normal): 1 00:07:33.611 Atomic Write Unit (PFail): 1 00:07:33.611 Atomic Compare & Write Unit: 1 00:07:33.611 Fused Compare & Write: Not Supported 00:07:33.611 Scatter-Gather List 00:07:33.611 SGL Command Set: Supported 00:07:33.611 SGL Keyed: Not Supported 00:07:33.611 SGL Bit Bucket Descriptor: Not Supported 00:07:33.611 SGL Metadata Pointer: Not Supported 00:07:33.611 Oversized SGL: Not Supported 00:07:33.611 SGL Metadata Address: Not Supported 00:07:33.611 SGL Offset: Not Supported 00:07:33.611 Transport SGL Data Block: Not Supported 00:07:33.611 Replay Protected Memory Block: Not Supported 00:07:33.611 00:07:33.611 Firmware Slot Information 00:07:33.611 ========================= 00:07:33.611 Active slot: 1 00:07:33.611 Slot 1 Firmware Revision: 1.0 00:07:33.611 00:07:33.611 00:07:33.611 Commands Supported and Effects 00:07:33.611 ============================== 00:07:33.611 Admin Commands 00:07:33.611 -------------- 00:07:33.611 Delete I/O Submission Queue (00h): Supported 00:07:33.611 Create I/O Submission Queue (01h): Supported 00:07:33.611 Get Log Page (02h): Supported 00:07:33.611 Delete I/O Completion Queue (04h): Supported 00:07:33.611 Create I/O Completion Queue (05h): Supported 00:07:33.611 Identify (06h): Supported 00:07:33.611 Abort (08h): Supported 00:07:33.611 Set Features (09h): Supported 00:07:33.611 Get Features (0Ah): Supported 00:07:33.611 Asynchronous Event Request (0Ch): Supported 00:07:33.611 Namespace Attachment (15h): Supported NS-Inventory-Change 00:07:33.611 Directive Send (19h): Supported 00:07:33.611 Directive Receive (1Ah): Supported 00:07:33.611 Virtualization Management (1Ch): Supported 00:07:33.611 Doorbell Buffer Config (7Ch): Supported 00:07:33.611 Format NVM (80h): Supported LBA-Change 00:07:33.611 I/O Commands 00:07:33.611 ------------ 00:07:33.611 Flush (00h): Supported LBA-Change 00:07:33.611 Write (01h): Supported LBA-Change 00:07:33.611 Read (02h): Supported 00:07:33.611 Compare (05h): Supported 00:07:33.611 Write Zeroes (08h): Supported LBA-Change 00:07:33.611 Dataset Management (09h): Supported LBA-Change 00:07:33.611 Unknown (0Ch): Supported 00:07:33.611 Unknown (12h): Supported 00:07:33.611 Copy (19h): Supported LBA-Change 00:07:33.611 Unknown (1Dh): Supported LBA-Change 00:07:33.611 00:07:33.611 Error Log 00:07:33.611 ========= 00:07:33.611 00:07:33.611 Arbitration 00:07:33.611 =========== 00:07:33.611 Arbitration Burst: no limit 00:07:33.611 00:07:33.611 Power Management 00:07:33.611 ================ 00:07:33.611 Number of Power States: 1 00:07:33.611 Current Power State: Power State #0 00:07:33.611 Power State #0: 00:07:33.611 Max Power: 25.00 W 00:07:33.611 Non-Operational State: Operational 00:07:33.611 Entry Latency: 16 microseconds 00:07:33.611 Exit Latency: 4 microseconds 00:07:33.611 Relative Read Throughput: 0 00:07:33.611 Relative Read Latency: 0 00:07:33.611 Relative Write Throughput: 0 00:07:33.611 Relative Write Latency: 0 00:07:33.611 Idle Power: Not Reported 00:07:33.611 Active Power: Not Reported 00:07:33.611 Non-Operational Permissive Mode: Not Supported 00:07:33.611 00:07:33.611 Health Information 00:07:33.611 ================== 00:07:33.611 Critical Warnings: 00:07:33.611 Available Spare Space: OK 00:07:33.611 Temperature: OK 00:07:33.611 Device Reliability: OK 00:07:33.611 Read Only: No 00:07:33.611 Volatile Memory Backup: OK 00:07:33.611 Current Temperature: 323 Kelvin (50 Celsius) 00:07:33.611 
Temperature Threshold: 343 Kelvin (70 Celsius) 00:07:33.611 Available Spare: 0% 00:07:33.611 Available Spare Threshold: 0% 00:07:33.611 Life Percentage Used: 0% 00:07:33.611 Data Units Read: 1032 00:07:33.611 Data Units Written: 961 00:07:33.611 Host Read Commands: 37123 00:07:33.611 Host Write Commands: 36546 00:07:33.611 Controller Busy Time: 0 minutes 00:07:33.611 Power Cycles: 0 00:07:33.611 Power On Hours: 0 hours 00:07:33.611 Unsafe Shutdowns: 0 00:07:33.611 Unrecoverable Media Errors: 0 00:07:33.611 Lifetime Error Log Entries: 0 00:07:33.611 Warning Temperature Time: 0 minutes 00:07:33.611 Critical Temperature Time: 0 minutes 00:07:33.611 00:07:33.611 Number of Queues 00:07:33.611 ================ 00:07:33.611 Number of I/O Submission Queues: 64 00:07:33.611 Number of I/O Completion Queues: 64 00:07:33.611 00:07:33.611 ZNS Specific Controller Data 00:07:33.611 ============================ 00:07:33.611 Zone Append Size Limit: 0 00:07:33.611 00:07:33.611 00:07:33.611 Active Namespaces 00:07:33.611 ================= 00:07:33.611 Namespace ID:1 00:07:33.611 Error Recovery Timeout: Unlimited 00:07:33.611 Command Set Identifier: NVM (00h) 00:07:33.611 Deallocate: Supported 00:07:33.611 Deallocated/Unwritten Error: Supported 00:07:33.611 Deallocated Read Value: All 0x00 00:07:33.611 Deallocate in Write Zeroes: Not Supported 00:07:33.611 Deallocated Guard Field: 0xFFFF 00:07:33.611 Flush: Supported 00:07:33.611 Reservation: Not Supported 00:07:33.611 Namespace Sharing Capabilities: Multiple Controllers 00:07:33.611 Size (in LBAs): 262144 (1GiB) 00:07:33.611 Capacity (in LBAs): 262144 (1GiB) 00:07:33.611 Utilization (in LBAs): 262144 (1GiB) 00:07:33.611 Thin Provisioning: Not Supported 00:07:33.611 Per-NS Atomic Units: No 00:07:33.611 Maximum Single Source Range Length: 128 00:07:33.611 Maximum Copy Length: 128 00:07:33.611 Maximum Source Range Count: 128 00:07:33.611 NGUID/EUI64 Never Reused: No 00:07:33.611 Namespace Write Protected: No 00:07:33.611 Endurance group ID: 1 00:07:33.611 Number of LBA Formats: 8 00:07:33.611 Current LBA Format: LBA Format #04 00:07:33.611 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:33.611 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:33.611 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:33.611 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:33.611 LBA Format #04: Data Size: 4096 Metadata Size: 0 00:07:33.611 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:33.611 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:33.611 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:33.611 00:07:33.611 Get Feature FDP: 00:07:33.611 ================ 00:07:33.611 Enabled: Yes 00:07:33.611 FDP configuration index: 0 00:07:33.611 00:07:33.611 FDP configurations log page 00:07:33.611 =========================== 00:07:33.611 Number of FDP configurations: 1 00:07:33.611 Version: 0 00:07:33.611 Size: 112 00:07:33.611 FDP Configuration Descriptor: 0 00:07:33.611 Descriptor Size: 96 00:07:33.611 Reclaim Group Identifier format: 2 00:07:33.611 FDP Volatile Write Cache: Not Present 00:07:33.611 FDP Configuration: Valid 00:07:33.611 Vendor Specific Size: 0 00:07:33.611 Number of Reclaim Groups: 2 00:07:33.611 Number of Recalim Unit Handles: 8 00:07:33.611 Max Placement Identifiers: 128 00:07:33.611 Number of Namespaces Suppprted: 256 00:07:33.611 Reclaim unit Nominal Size: 6000000 bytes 00:07:33.611 Estimated Reclaim Unit Time Limit: Not Reported 00:07:33.611 RUH Desc #000: RUH Type: Initially Isolated 00:07:33.611 RUH Desc #001: RUH 
Type: Initially Isolated 00:07:33.611 RUH Desc #002: RUH Type: Initially Isolated 00:07:33.611 RUH Desc #003: RUH Type: Initially Isolated 00:07:33.611 RUH Desc #004: RUH Type: Initially Isolated 00:07:33.611 RUH Desc #005: RUH Type: Initially Isolated 00:07:33.611 RUH Desc #006: RUH Type: Initially Isolated 00:07:33.611 RUH Desc #007: RUH Type: Initially Isolated 00:07:33.611 00:07:33.611 FDP reclaim unit handle usage log page 00:07:33.611 ====================================== 00:07:33.611 Number of Reclaim Unit Handles: 8 00:07:33.611 RUH Usage Desc #000: RUH Attributes: Controller Specified 00:07:33.611 RUH Usage Desc #001: RUH Attributes: Unused 00:07:33.611 RUH Usage Desc #002: RUH Attributes: Unused 00:07:33.611 RUH Usage Desc #003: RUH Attributes: Unused 00:07:33.611 RUH Usage Desc #004: RUH Attributes: Unused 00:07:33.611 RUH Usage Desc #005: RUH Attributes: Unused 00:07:33.611 RUH Usage Desc #006: RUH Attributes: Unused 00:07:33.611 RUH Usage Desc #007: RUH Attributes: Unused 00:07:33.611 00:07:33.611 FDP statistics log page 00:07:33.611 ======================= 00:07:33.611 Host bytes with metadata written: 567779328 00:07:33.611 [2024-12-15 04:57:53.665517] nvme_ctrlr.c:3641:nvme_ctrlr_remove_inactive_proc: *ERROR*: [0000:00:12.0, 0] process 76476 terminated unexpected 00:07:33.611 Media bytes with metadata written: 567857152 00:07:33.611 Media bytes erased: 0 00:07:33.611 00:07:33.611 FDP events log page 00:07:33.611 =================== 00:07:33.611 Number of FDP events: 0 00:07:33.611 00:07:33.611 NVM Specific Namespace Data 00:07:33.611 =========================== 00:07:33.611 Logical Block Storage Tag Mask: 0 00:07:33.611 Protection Information Capabilities: 00:07:33.611 16b Guard Protection Information Storage Tag Support: No 00:07:33.611 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:33.611 Storage Tag Check Read Support: No 00:07:33.611 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.611 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.611 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.611 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.611 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.612 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.612 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.612 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.612 ===================================================== 00:07:33.612 NVMe Controller at 0000:00:12.0 [1b36:0010] 00:07:33.612 ===================================================== 00:07:33.612 Controller Capabilities/Features 00:07:33.612 ================================ 00:07:33.612 Vendor ID: 1b36 00:07:33.612 Subsystem Vendor ID: 1af4 00:07:33.612 Serial Number: 12342 00:07:33.612 Model Number: QEMU NVMe Ctrl 00:07:33.612 Firmware Version: 8.0.0 00:07:33.612 Recommended Arb Burst: 6 00:07:33.612 IEEE OUI Identifier: 00 54 52 00:07:33.612 Multi-path I/O 00:07:33.612 May have multiple subsystem ports: No 00:07:33.612 May have multiple controllers: No 00:07:33.612 Associated with SR-IOV VF: No 00:07:33.612 Max Data Transfer Size: 524288 00:07:33.612 Max Number of Namespaces: 256
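(For context on the FDP log pages earlier in this dump: with Flexible Data Placement, the host can tag a write with a placement identifier that the controller maps to a reclaim unit handle (RUH), steering unrelated data streams into separate reclaim units. This controller advertises 2 reclaim groups and 8 RUHs, of which only RUH #000 is controller-specified and the other seven are unused, up to 128 placement identifiers, and a nominal reclaim unit size of 6000000 bytes.)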
00:07:33.612 Max Number of I/O Queues: 64 00:07:33.612 NVMe Specification Version (VS): 1.4 00:07:33.612 NVMe Specification Version (Identify): 1.4 00:07:33.612 Maximum Queue Entries: 2048 00:07:33.612 Contiguous Queues Required: Yes 00:07:33.612 Arbitration Mechanisms Supported 00:07:33.612 Weighted Round Robin: Not Supported 00:07:33.612 Vendor Specific: Not Supported 00:07:33.612 Reset Timeout: 7500 ms 00:07:33.612 Doorbell Stride: 4 bytes 00:07:33.612 NVM Subsystem Reset: Not Supported 00:07:33.612 Command Sets Supported 00:07:33.612 NVM Command Set: Supported 00:07:33.612 Boot Partition: Not Supported 00:07:33.612 Memory Page Size Minimum: 4096 bytes 00:07:33.612 Memory Page Size Maximum: 65536 bytes 00:07:33.612 Persistent Memory Region: Not Supported 00:07:33.612 Optional Asynchronous Events Supported 00:07:33.612 Namespace Attribute Notices: Supported 00:07:33.612 Firmware Activation Notices: Not Supported 00:07:33.612 ANA Change Notices: Not Supported 00:07:33.612 PLE Aggregate Log Change Notices: Not Supported 00:07:33.612 LBA Status Info Alert Notices: Not Supported 00:07:33.612 EGE Aggregate Log Change Notices: Not Supported 00:07:33.612 Normal NVM Subsystem Shutdown event: Not Supported 00:07:33.612 Zone Descriptor Change Notices: Not Supported 00:07:33.612 Discovery Log Change Notices: Not Supported 00:07:33.612 Controller Attributes 00:07:33.612 128-bit Host Identifier: Not Supported 00:07:33.612 Non-Operational Permissive Mode: Not Supported 00:07:33.612 NVM Sets: Not Supported 00:07:33.612 Read Recovery Levels: Not Supported 00:07:33.612 Endurance Groups: Not Supported 00:07:33.612 Predictable Latency Mode: Not Supported 00:07:33.612 Traffic Based Keep Alive: Not Supported 00:07:33.612 Namespace Granularity: Not Supported 00:07:33.612 SQ Associations: Not Supported 00:07:33.612 UUID List: Not Supported 00:07:33.612 Multi-Domain Subsystem: Not Supported 00:07:33.612 Fixed Capacity Management: Not Supported 00:07:33.612 Variable Capacity Management: Not Supported 00:07:33.612 Delete Endurance Group: Not Supported 00:07:33.612 Delete NVM Set: Not Supported 00:07:33.612 Extended LBA Formats Supported: Supported 00:07:33.612 Flexible Data Placement Supported: Not Supported 00:07:33.612 00:07:33.612 Controller Memory Buffer Support 00:07:33.612 ================================ 00:07:33.612 Supported: No 00:07:33.612 00:07:33.612 Persistent Memory Region Support 00:07:33.612 ================================ 00:07:33.612 Supported: No 00:07:33.612 00:07:33.612 Admin Command Set Attributes 00:07:33.612 ============================ 00:07:33.612 Security Send/Receive: Not Supported 00:07:33.612 Format NVM: Supported 00:07:33.612 Firmware Activate/Download: Not Supported 00:07:33.612 Namespace Management: Supported 00:07:33.612 Device Self-Test: Not Supported 00:07:33.612 Directives: Supported 00:07:33.612 NVMe-MI: Not Supported 00:07:33.612 Virtualization Management: Not Supported 00:07:33.612 Doorbell Buffer Config: Supported 00:07:33.612 Get LBA Status Capability: Not Supported 00:07:33.612 Command & Feature Lockdown Capability: Not Supported 00:07:33.612 Abort Command Limit: 4 00:07:33.612 Async Event Request Limit: 4 00:07:33.612 Number of Firmware Slots: N/A 00:07:33.612 Firmware Slot 1 Read-Only: N/A 00:07:33.612 Firmware Activation Without Reset: N/A 00:07:33.612 Multiple Update Detection Support: N/A 00:07:33.612 Firmware Update Granularity: No Information Provided 00:07:33.612 Per-Namespace SMART Log: Yes 00:07:33.612 Asymmetric Namespace Access Log Page: Not Supported
00:07:33.612 Subsystem NQN: nqn.2019-08.org.qemu:12342 00:07:33.612 Command Effects Log Page: Supported 00:07:33.612 Get Log Page Extended Data: Supported 00:07:33.612 Telemetry Log Pages: Not Supported 00:07:33.612 Persistent Event Log Pages: Not Supported 00:07:33.612 Supported Log Pages Log Page: May Support 00:07:33.612 Commands Supported & Effects Log Page: Not Supported 00:07:33.612 Feature Identifiers & Effects Log Page: May Support 00:07:33.612 NVMe-MI Commands & Effects Log Page: May Support 00:07:33.612 Data Area 4 for Telemetry Log: Not Supported 00:07:33.612 Error Log Page Entries Supported: 1 00:07:33.612 Keep Alive: Not Supported 00:07:33.612 00:07:33.612 NVM Command Set Attributes 00:07:33.612 ========================== 00:07:33.612 Submission Queue Entry Size 00:07:33.612 Max: 64 00:07:33.612 Min: 64 00:07:33.612 Completion Queue Entry Size 00:07:33.612 Max: 16 00:07:33.612 Min: 16 00:07:33.612 Number of Namespaces: 256 00:07:33.612 Compare Command: Supported 00:07:33.612 Write Uncorrectable Command: Not Supported 00:07:33.612 Dataset Management Command: Supported 00:07:33.612 Write Zeroes Command: Supported 00:07:33.612 Set Features Save Field: Supported 00:07:33.612 Reservations: Not Supported 00:07:33.612 Timestamp: Supported 00:07:33.612 Copy: Supported 00:07:33.612 Volatile Write Cache: Present 00:07:33.612 Atomic Write Unit (Normal): 1 00:07:33.612 Atomic Write Unit (PFail): 1 00:07:33.612 Atomic Compare & Write Unit: 1 00:07:33.612 Fused Compare & Write: Not Supported 00:07:33.612 Scatter-Gather List 00:07:33.612 SGL Command Set: Supported 00:07:33.612 SGL Keyed: Not Supported 00:07:33.612 SGL Bit Bucket Descriptor: Not Supported 00:07:33.612 SGL Metadata Pointer: Not Supported 00:07:33.612 Oversized SGL: Not Supported 00:07:33.612 SGL Metadata Address: Not Supported 00:07:33.612 SGL Offset: Not Supported 00:07:33.612 Transport SGL Data Block: Not Supported 00:07:33.612 Replay Protected Memory Block: Not Supported 00:07:33.612 00:07:33.612 Firmware Slot Information 00:07:33.612 ========================= 00:07:33.612 Active slot: 1 00:07:33.612 Slot 1 Firmware Revision: 1.0 00:07:33.612 00:07:33.612 00:07:33.612 Commands Supported and Effects 00:07:33.612 ============================== 00:07:33.612 Admin Commands 00:07:33.612 -------------- 00:07:33.612 Delete I/O Submission Queue (00h): Supported 00:07:33.612 Create I/O Submission Queue (01h): Supported 00:07:33.612 Get Log Page (02h): Supported 00:07:33.612 Delete I/O Completion Queue (04h): Supported 00:07:33.612 Create I/O Completion Queue (05h): Supported 00:07:33.612 Identify (06h): Supported 00:07:33.612 Abort (08h): Supported 00:07:33.612 Set Features (09h): Supported 00:07:33.612 Get Features (0Ah): Supported 00:07:33.612 Asynchronous Event Request (0Ch): Supported 00:07:33.612 Namespace Attachment (15h): Supported NS-Inventory-Change 00:07:33.612 Directive Send (19h): Supported 00:07:33.612 Directive Receive (1Ah): Supported 00:07:33.612 Virtualization Management (1Ch): Supported 00:07:33.612 Doorbell Buffer Config (7Ch): Supported 00:07:33.612 Format NVM (80h): Supported LBA-Change 00:07:33.612 I/O Commands 00:07:33.612 ------------ 00:07:33.612 Flush (00h): Supported LBA-Change 00:07:33.613 Write (01h): Supported LBA-Change 00:07:33.613 Read (02h): Supported 00:07:33.613 Compare (05h): Supported 00:07:33.613 Write Zeroes (08h): Supported LBA-Change 00:07:33.613 Dataset Management (09h): Supported LBA-Change 00:07:33.613 Unknown (0Ch): Supported 00:07:33.613 Unknown (12h): Supported 00:07:33.613 Copy (19h):
Supported LBA-Change 00:07:33.613 Unknown (1Dh): Supported LBA-Change 00:07:33.613 00:07:33.613 Error Log 00:07:33.613 ========= 00:07:33.613 00:07:33.613 Arbitration 00:07:33.613 =========== 00:07:33.613 Arbitration Burst: no limit 00:07:33.613 00:07:33.613 Power Management 00:07:33.613 ================ 00:07:33.613 Number of Power States: 1 00:07:33.613 Current Power State: Power State #0 00:07:33.613 Power State #0: 00:07:33.613 Max Power: 25.00 W 00:07:33.613 Non-Operational State: Operational 00:07:33.613 Entry Latency: 16 microseconds 00:07:33.613 Exit Latency: 4 microseconds 00:07:33.613 Relative Read Throughput: 0 00:07:33.613 Relative Read Latency: 0 00:07:33.613 Relative Write Throughput: 0 00:07:33.613 Relative Write Latency: 0 00:07:33.613 Idle Power: Not Reported 00:07:33.613 Active Power: Not Reported 00:07:33.613 Non-Operational Permissive Mode: Not Supported 00:07:33.613 00:07:33.613 Health Information 00:07:33.613 ================== 00:07:33.613 Critical Warnings: 00:07:33.613 Available Spare Space: OK 00:07:33.613 Temperature: OK 00:07:33.613 Device Reliability: OK 00:07:33.613 Read Only: No 00:07:33.613 Volatile Memory Backup: OK 00:07:33.613 Current Temperature: 323 Kelvin (50 Celsius) 00:07:33.613 Temperature Threshold: 343 Kelvin (70 Celsius) 00:07:33.613 Available Spare: 0% 00:07:33.613 Available Spare Threshold: 0% 00:07:33.613 Life Percentage Used: 0% 00:07:33.613 Data Units Read: 2233 00:07:33.613 Data Units Written: 2020 00:07:33.613 Host Read Commands: 104082 00:07:33.613 Host Write Commands: 102353 00:07:33.613 Controller Busy Time: 0 minutes 00:07:33.613 Power Cycles: 0 00:07:33.613 Power On Hours: 0 hours 00:07:33.613 Unsafe Shutdowns: 0 00:07:33.613 Unrecoverable Media Errors: 0 00:07:33.613 Lifetime Error Log Entries: 0 00:07:33.613 Warning Temperature Time: 0 minutes 00:07:33.613 Critical Temperature Time: 0 minutes 00:07:33.613 00:07:33.613 Number of Queues 00:07:33.613 ================ 00:07:33.613 Number of I/O Submission Queues: 64 00:07:33.613 Number of I/O Completion Queues: 64 00:07:33.613 00:07:33.613 ZNS Specific Controller Data 00:07:33.613 ============================ 00:07:33.613 Zone Append Size Limit: 0 00:07:33.613 00:07:33.613 00:07:33.613 Active Namespaces 00:07:33.613 ================= 00:07:33.613 Namespace ID:1 00:07:33.613 Error Recovery Timeout: Unlimited 00:07:33.613 Command Set Identifier: NVM (00h) 00:07:33.613 Deallocate: Supported 00:07:33.613 Deallocated/Unwritten Error: Supported 00:07:33.613 Deallocated Read Value: All 0x00 00:07:33.613 Deallocate in Write Zeroes: Not Supported 00:07:33.613 Deallocated Guard Field: 0xFFFF 00:07:33.613 Flush: Supported 00:07:33.613 Reservation: Not Supported 00:07:33.613 Namespace Sharing Capabilities: Private 00:07:33.613 Size (in LBAs): 1048576 (4GiB) 00:07:33.613 Capacity (in LBAs): 1048576 (4GiB) 00:07:33.613 Utilization (in LBAs): 1048576 (4GiB) 00:07:33.613 Thin Provisioning: Not Supported 00:07:33.613 Per-NS Atomic Units: No 00:07:33.613 Maximum Single Source Range Length: 128 00:07:33.613 Maximum Copy Length: 128 00:07:33.613 Maximum Source Range Count: 128 00:07:33.613 NGUID/EUI64 Never Reused: No 00:07:33.613 Namespace Write Protected: No 00:07:33.613 Number of LBA Formats: 8 00:07:33.613 Current LBA Format: LBA Format #04 00:07:33.613 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:33.613 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:33.613 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:33.613 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:33.613 LBA 
Format #04: Data Size: 4096 Metadata Size: 0 00:07:33.613 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:33.613 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:33.613 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:33.613 00:07:33.613 NVM Specific Namespace Data 00:07:33.613 =========================== 00:07:33.613 Logical Block Storage Tag Mask: 0 00:07:33.613 Protection Information Capabilities: 00:07:33.613 16b Guard Protection Information Storage Tag Support: No 00:07:33.613 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:33.613 Storage Tag Check Read Support: No 00:07:33.613 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Namespace ID:2 00:07:33.613 Error Recovery Timeout: Unlimited 00:07:33.613 Command Set Identifier: NVM (00h) 00:07:33.613 Deallocate: Supported 00:07:33.613 Deallocated/Unwritten Error: Supported 00:07:33.613 Deallocated Read Value: All 0x00 00:07:33.613 Deallocate in Write Zeroes: Not Supported 00:07:33.613 Deallocated Guard Field: 0xFFFF 00:07:33.613 Flush: Supported 00:07:33.613 Reservation: Not Supported 00:07:33.613 Namespace Sharing Capabilities: Private 00:07:33.613 Size (in LBAs): 1048576 (4GiB) 00:07:33.613 Capacity (in LBAs): 1048576 (4GiB) 00:07:33.613 Utilization (in LBAs): 1048576 (4GiB) 00:07:33.613 Thin Provisioning: Not Supported 00:07:33.613 Per-NS Atomic Units: No 00:07:33.613 Maximum Single Source Range Length: 128 00:07:33.613 Maximum Copy Length: 128 00:07:33.613 Maximum Source Range Count: 128 00:07:33.613 NGUID/EUI64 Never Reused: No 00:07:33.613 Namespace Write Protected: No 00:07:33.613 Number of LBA Formats: 8 00:07:33.613 Current LBA Format: LBA Format #04 00:07:33.613 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:33.613 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:33.613 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:33.613 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:33.613 LBA Format #04: Data Size: 4096 Metadata Size: 0 00:07:33.613 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:33.613 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:33.613 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:33.613 00:07:33.613 NVM Specific Namespace Data 00:07:33.613 =========================== 00:07:33.613 Logical Block Storage Tag Mask: 0 00:07:33.613 Protection Information Capabilities: 00:07:33.613 16b Guard Protection Information Storage Tag Support: No 00:07:33.613 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:33.613 Storage Tag Check Read Support: No 00:07:33.613 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #01: Storage Tag Size: 0 , Protection 
Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.613 Namespace ID:3 00:07:33.613 Error Recovery Timeout: Unlimited 00:07:33.613 Command Set Identifier: NVM (00h) 00:07:33.613 Deallocate: Supported 00:07:33.613 Deallocated/Unwritten Error: Supported 00:07:33.613 Deallocated Read Value: All 0x00 00:07:33.613 Deallocate in Write Zeroes: Not Supported 00:07:33.613 Deallocated Guard Field: 0xFFFF 00:07:33.613 Flush: Supported 00:07:33.613 Reservation: Not Supported 00:07:33.613 Namespace Sharing Capabilities: Private 00:07:33.613 Size (in LBAs): 1048576 (4GiB) 00:07:33.613 Capacity (in LBAs): 1048576 (4GiB) 00:07:33.613 Utilization (in LBAs): 1048576 (4GiB) 00:07:33.613 Thin Provisioning: Not Supported 00:07:33.613 Per-NS Atomic Units: No 00:07:33.613 Maximum Single Source Range Length: 128 00:07:33.613 Maximum Copy Length: 128 00:07:33.613 Maximum Source Range Count: 128 00:07:33.613 NGUID/EUI64 Never Reused: No 00:07:33.613 Namespace Write Protected: No 00:07:33.613 Number of LBA Formats: 8 00:07:33.613 Current LBA Format: LBA Format #04 00:07:33.613 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:33.613 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:33.613 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:33.613 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:33.613 LBA Format #04: Data Size: 4096 Metadata Size: 0 00:07:33.613 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:33.613 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:33.613 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:33.613 00:07:33.613 NVM Specific Namespace Data 00:07:33.613 =========================== 00:07:33.614 Logical Block Storage Tag Mask: 0 00:07:33.614 Protection Information Capabilities: 00:07:33.614 16b Guard Protection Information Storage Tag Support: No 00:07:33.614 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:33.614 Storage Tag Check Read Support: No 00:07:33.614 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.614 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.614 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.614 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.614 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.614 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.614 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.614 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.614 04:57:53 nvme.nvme_identify -- nvme/nvme.sh@15 -- # for bdf in "${bdfs[@]}" 00:07:33.614 04:57:53 nvme.nvme_identify -- nvme/nvme.sh@16 -- # 
/home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:10.0' -i 0 00:07:33.873 ===================================================== 00:07:33.873 NVMe Controller at 0000:00:10.0 [1b36:0010] 00:07:33.873 ===================================================== 00:07:33.873 Controller Capabilities/Features 00:07:33.873 ================================ 00:07:33.873 Vendor ID: 1b36 00:07:33.873 Subsystem Vendor ID: 1af4 00:07:33.873 Serial Number: 12340 00:07:33.873 Model Number: QEMU NVMe Ctrl 00:07:33.873 Firmware Version: 8.0.0 00:07:33.873 Recommended Arb Burst: 6 00:07:33.873 IEEE OUI Identifier: 00 54 52 00:07:33.873 Multi-path I/O 00:07:33.873 May have multiple subsystem ports: No 00:07:33.873 May have multiple controllers: No 00:07:33.873 Associated with SR-IOV VF: No 00:07:33.873 Max Data Transfer Size: 524288 00:07:33.873 Max Number of Namespaces: 256 00:07:33.873 Max Number of I/O Queues: 64 00:07:33.873 NVMe Specification Version (VS): 1.4 00:07:33.873 NVMe Specification Version (Identify): 1.4 00:07:33.873 Maximum Queue Entries: 2048 00:07:33.873 Contiguous Queues Required: Yes 00:07:33.873 Arbitration Mechanisms Supported 00:07:33.873 Weighted Round Robin: Not Supported 00:07:33.873 Vendor Specific: Not Supported 00:07:33.873 Reset Timeout: 7500 ms 00:07:33.873 Doorbell Stride: 4 bytes 00:07:33.873 NVM Subsystem Reset: Not Supported 00:07:33.873 Command Sets Supported 00:07:33.873 NVM Command Set: Supported 00:07:33.873 Boot Partition: Not Supported 00:07:33.873 Memory Page Size Minimum: 4096 bytes 00:07:33.873 Memory Page Size Maximum: 65536 bytes 00:07:33.873 Persistent Memory Region: Not Supported 00:07:33.873 Optional Asynchronous Events Supported 00:07:33.873 Namespace Attribute Notices: Supported 00:07:33.873 Firmware Activation Notices: Not Supported 00:07:33.873 ANA Change Notices: Not Supported 00:07:33.873 PLE Aggregate Log Change Notices: Not Supported 00:07:33.873 LBA Status Info Alert Notices: Not Supported 00:07:33.873 EGE Aggregate Log Change Notices: Not Supported 00:07:33.873 Normal NVM Subsystem Shutdown event: Not Supported 00:07:33.873 Zone Descriptor Change Notices: Not Supported 00:07:33.873 Discovery Log Change Notices: Not Supported 00:07:33.873 Controller Attributes 00:07:33.873 128-bit Host Identifier: Not Supported 00:07:33.873 Non-Operational Permissive Mode: Not Supported 00:07:33.873 NVM Sets: Not Supported 00:07:33.873 Read Recovery Levels: Not Supported 00:07:33.873 Endurance Groups: Not Supported 00:07:33.873 Predictable Latency Mode: Not Supported 00:07:33.873 Traffic Based Keep Alive: Not Supported 00:07:33.873 Namespace Granularity: Not Supported 00:07:33.873 SQ Associations: Not Supported 00:07:33.873 UUID List: Not Supported 00:07:33.873 Multi-Domain Subsystem: Not Supported 00:07:33.873 Fixed Capacity Management: Not Supported 00:07:33.873 Variable Capacity Management: Not Supported 00:07:33.873 Delete Endurance Group: Not Supported 00:07:33.873 Delete NVM Set: Not Supported 00:07:33.873 Extended LBA Formats Supported: Supported 00:07:33.873 Flexible Data Placement Supported: Not Supported 00:07:33.873 00:07:33.873 Controller Memory Buffer Support 00:07:33.873 ================================ 00:07:33.873 Supported: No 00:07:33.873 00:07:33.873 Persistent Memory Region Support 00:07:33.873 ================================ 00:07:33.873 Supported: No 00:07:33.873 00:07:33.873 Admin Command Set Attributes 00:07:33.873 ============================ 00:07:33.873 Security Send/Receive: Not Supported 00:07:33.874
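(The spdk_nvme_identify runs in this log, one per PCIe traddr as driven by nvme.sh, produce the controller dumps shown here. For reference, below is a minimal sketch of how the same controller data can be read through SPDK's public C API: spdk_nvme_probe(), spdk_nvme_ctrlr_get_data(), and the spdk_nvme_ctrlr_data fields are real interfaces from spdk/nvme.h, but the program around them, including the app name, is illustrative only and not the test's code; exact env bootstrap details can vary between SPDK versions.)

#include "spdk/stdinc.h"
#include "spdk/env.h"
#include "spdk/nvme.h"

/* Called once per controller that probe_cb accepted; print a few of the
 * identify fields that appear in the dumps above. */
static void
attach_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
          struct spdk_nvme_ctrlr *ctrlr, const struct spdk_nvme_ctrlr_opts *opts)
{
        const struct spdk_nvme_ctrlr_data *cdata = spdk_nvme_ctrlr_get_data(ctrlr);

        /* sn/mn are fixed-width, space-padded byte fields, not C strings,
         * hence the explicit field widths. */
        printf("NVMe Controller at %s [%04x:%04x]\n", trid->traddr, cdata->vid, cdata->ssvid);
        printf("Serial Number: %.20s\n", cdata->sn);
        printf("Model Number: %.40s\n", cdata->mn);
        printf("Max Number of Namespaces: %u\n", cdata->nn);
}

/* Return true to attach to every controller the probe finds. */
static bool
probe_cb(void *cb_ctx, const struct spdk_nvme_transport_id *trid,
         struct spdk_nvme_ctrlr_opts *opts)
{
        return true;
}

int
main(void)
{
        struct spdk_env_opts opts;

        spdk_env_opts_init(&opts);
        opts.name = "identify_sketch"; /* hypothetical app name */
        if (spdk_env_init(&opts) < 0) {
                return 1;
        }
        /* NULL transport ID: enumerate all local PCIe NVMe controllers. */
        if (spdk_nvme_probe(NULL, NULL, probe_cb, attach_cb, NULL) != 0) {
                return 1;
        }
        return 0;
}

(Passing a NULL transport ID probes every local PCIe controller; the identify tool invoked above instead restricts itself to a single device via its -r 'trtype:PCIe traddr:...' argument.)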
Format NVM: Supported 00:07:33.874 Firmware Activate/Download: Not Supported 00:07:33.874 Namespace Management: Supported 00:07:33.874 Device Self-Test: Not Supported 00:07:33.874 Directives: Supported 00:07:33.874 NVMe-MI: Not Supported 00:07:33.874 Virtualization Management: Not Supported 00:07:33.874 Doorbell Buffer Config: Supported 00:07:33.874 Get LBA Status Capability: Not Supported 00:07:33.874 Command & Feature Lockdown Capability: Not Supported 00:07:33.874 Abort Command Limit: 4 00:07:33.874 Async Event Request Limit: 4 00:07:33.874 Number of Firmware Slots: N/A 00:07:33.874 Firmware Slot 1 Read-Only: N/A 00:07:33.874 Firmware Activation Without Reset: N/A 00:07:33.874 Multiple Update Detection Support: N/A 00:07:33.874 Firmware Update Granularity: No Information Provided 00:07:33.874 Per-Namespace SMART Log: Yes 00:07:33.874 Asymmetric Namespace Access Log Page: Not Supported 00:07:33.874 Subsystem NQN: nqn.2019-08.org.qemu:12340 00:07:33.874 Command Effects Log Page: Supported 00:07:33.874 Get Log Page Extended Data: Supported 00:07:33.874 Telemetry Log Pages: Not Supported 00:07:33.874 Persistent Event Log Pages: Not Supported 00:07:33.874 Supported Log Pages Log Page: May Support 00:07:33.874 Commands Supported & Effects Log Page: Not Supported 00:07:33.874 Feature Identifiers & Effects Log Page: May Support 00:07:33.874 NVMe-MI Commands & Effects Log Page: May Support 00:07:33.874 Data Area 4 for Telemetry Log: Not Supported 00:07:33.874 Error Log Page Entries Supported: 1 00:07:33.874 Keep Alive: Not Supported 00:07:33.874 00:07:33.874 NVM Command Set Attributes 00:07:33.874 ========================== 00:07:33.874 Submission Queue Entry Size 00:07:33.874 Max: 64 00:07:33.874 Min: 64 00:07:33.874 Completion Queue Entry Size 00:07:33.874 Max: 16 00:07:33.874 Min: 16 00:07:33.874 Number of Namespaces: 256 00:07:33.874 Compare Command: Supported 00:07:33.874 Write Uncorrectable Command: Not Supported 00:07:33.874 Dataset Management Command: Supported 00:07:33.874 Write Zeroes Command: Supported 00:07:33.874 Set Features Save Field: Supported 00:07:33.874 Reservations: Not Supported 00:07:33.874 Timestamp: Supported 00:07:33.874 Copy: Supported 00:07:33.874 Volatile Write Cache: Present 00:07:33.874 Atomic Write Unit (Normal): 1 00:07:33.874 Atomic Write Unit (PFail): 1 00:07:33.874 Atomic Compare & Write Unit: 1 00:07:33.874 Fused Compare & Write: Not Supported 00:07:33.874 Scatter-Gather List 00:07:33.874 SGL Command Set: Supported 00:07:33.874 SGL Keyed: Not Supported 00:07:33.874 SGL Bit Bucket Descriptor: Not Supported 00:07:33.874 SGL Metadata Pointer: Not Supported 00:07:33.874 Oversized SGL: Not Supported 00:07:33.874 SGL Metadata Address: Not Supported 00:07:33.874 SGL Offset: Not Supported 00:07:33.874 Transport SGL Data Block: Not Supported 00:07:33.874 Replay Protected Memory Block: Not Supported 00:07:33.874 00:07:33.874 Firmware Slot Information 00:07:33.874 ========================= 00:07:33.874 Active slot: 1 00:07:33.874 Slot 1 Firmware Revision: 1.0 00:07:33.874 00:07:33.874 00:07:33.874 Commands Supported and Effects 00:07:33.874 ============================== 00:07:33.874 Admin Commands 00:07:33.874 -------------- 00:07:33.874 Delete I/O Submission Queue (00h): Supported 00:07:33.874 Create I/O Submission Queue (01h): Supported 00:07:33.874 Get Log Page (02h): Supported 00:07:33.874 Delete I/O Completion Queue (04h): Supported 00:07:33.874 Create I/O Completion Queue (05h): Supported 00:07:33.874 Identify (06h): Supported 00:07:33.874 Abort (08h): Supported
00:07:33.874 Set Features (09h): Supported 00:07:33.874 Get Features (0Ah): Supported 00:07:33.874 Asynchronous Event Request (0Ch): Supported 00:07:33.874 Namespace Attachment (15h): Supported NS-Inventory-Change 00:07:33.874 Directive Send (19h): Supported 00:07:33.874 Directive Receive (1Ah): Supported 00:07:33.874 Virtualization Management (1Ch): Supported 00:07:33.874 Doorbell Buffer Config (7Ch): Supported 00:07:33.874 Format NVM (80h): Supported LBA-Change 00:07:33.874 I/O Commands 00:07:33.874 ------------ 00:07:33.874 Flush (00h): Supported LBA-Change 00:07:33.874 Write (01h): Supported LBA-Change 00:07:33.874 Read (02h): Supported 00:07:33.874 Compare (05h): Supported 00:07:33.874 Write Zeroes (08h): Supported LBA-Change 00:07:33.874 Dataset Management (09h): Supported LBA-Change 00:07:33.874 Unknown (0Ch): Supported 00:07:33.874 Unknown (12h): Supported 00:07:33.874 Copy (19h): Supported LBA-Change 00:07:33.874 Unknown (1Dh): Supported LBA-Change 00:07:33.874 00:07:33.874 Error Log 00:07:33.874 ========= 00:07:33.874 00:07:33.874 Arbitration 00:07:33.874 =========== 00:07:33.874 Arbitration Burst: no limit 00:07:33.874 00:07:33.874 Power Management 00:07:33.874 ================ 00:07:33.874 Number of Power States: 1 00:07:33.874 Current Power State: Power State #0 00:07:33.874 Power State #0: 00:07:33.874 Max Power: 25.00 W 00:07:33.874 Non-Operational State: Operational 00:07:33.874 Entry Latency: 16 microseconds 00:07:33.874 Exit Latency: 4 microseconds 00:07:33.874 Relative Read Throughput: 0 00:07:33.874 Relative Read Latency: 0 00:07:33.874 Relative Write Throughput: 0 00:07:33.874 Relative Write Latency: 0 00:07:33.874 Idle Power: Not Reported 00:07:33.874 Active Power: Not Reported 00:07:33.874 Non-Operational Permissive Mode: Not Supported 00:07:33.874 00:07:33.874 Health Information 00:07:33.874 ================== 00:07:33.874 Critical Warnings: 00:07:33.874 Available Spare Space: OK 00:07:33.874 Temperature: OK 00:07:33.874 Device Reliability: OK 00:07:33.874 Read Only: No 00:07:33.874 Volatile Memory Backup: OK 00:07:33.874 Current Temperature: 323 Kelvin (50 Celsius) 00:07:33.874 Temperature Threshold: 343 Kelvin (70 Celsius) 00:07:33.874 Available Spare: 0% 00:07:33.874 Available Spare Threshold: 0% 00:07:33.874 Life Percentage Used: 0% 00:07:33.874 Data Units Read: 640 00:07:33.874 Data Units Written: 568 00:07:33.874 Host Read Commands: 33519 00:07:33.874 Host Write Commands: 33305 00:07:33.874 Controller Busy Time: 0 minutes 00:07:33.874 Power Cycles: 0 00:07:33.874 Power On Hours: 0 hours 00:07:33.874 Unsafe Shutdowns: 0 00:07:33.874 Unrecoverable Media Errors: 0 00:07:33.874 Lifetime Error Log Entries: 0 00:07:33.874 Warning Temperature Time: 0 minutes 00:07:33.874 Critical Temperature Time: 0 minutes 00:07:33.874 00:07:33.874 Number of Queues 00:07:33.874 ================ 00:07:33.874 Number of I/O Submission Queues: 64 00:07:33.874 Number of I/O Completion Queues: 64 00:07:33.874 00:07:33.874 ZNS Specific Controller Data 00:07:33.874 ============================ 00:07:33.874 Zone Append Size Limit: 0 00:07:33.874 00:07:33.874 00:07:33.874 Active Namespaces 00:07:33.874 ================= 00:07:33.874 Namespace ID:1 00:07:33.874 Error Recovery Timeout: Unlimited 00:07:33.874 Command Set Identifier: NVM (00h) 00:07:33.874 Deallocate: Supported 00:07:33.874 Deallocated/Unwritten Error: Supported 00:07:33.874 Deallocated Read Value: All 0x00 00:07:33.874 Deallocate in Write Zeroes: Not Supported 00:07:33.874 Deallocated Guard Field: 0xFFFF 00:07:33.874 Flush: 
Supported 00:07:33.874 Reservation: Not Supported 00:07:33.874 Metadata Transferred as: Separate Metadata Buffer 00:07:33.874 Namespace Sharing Capabilities: Private 00:07:33.874 Size (in LBAs): 1548666 (5GiB) 00:07:33.874 Capacity (in LBAs): 1548666 (5GiB) 00:07:33.874 Utilization (in LBAs): 1548666 (5GiB) 00:07:33.874 Thin Provisioning: Not Supported 00:07:33.874 Per-NS Atomic Units: No 00:07:33.874 Maximum Single Source Range Length: 128 00:07:33.874 Maximum Copy Length: 128 00:07:33.874 Maximum Source Range Count: 128 00:07:33.874 NGUID/EUI64 Never Reused: No 00:07:33.874 Namespace Write Protected: No 00:07:33.874 Number of LBA Formats: 8 00:07:33.874 Current LBA Format: LBA Format #07 00:07:33.874 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:33.874 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:33.874 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:33.874 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:33.874 LBA Format #04: Data Size: 4096 Metadata Size: 0 00:07:33.874 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:33.874 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:33.874 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:33.874 00:07:33.874 NVM Specific Namespace Data 00:07:33.874 =========================== 00:07:33.874 Logical Block Storage Tag Mask: 0 00:07:33.874 Protection Information Capabilities: 00:07:33.874 16b Guard Protection Information Storage Tag Support: No 00:07:33.874 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:33.874 Storage Tag Check Read Support: No 00:07:33.874 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.874 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.875 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.875 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.875 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.875 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.875 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.875 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:33.875 04:57:53 nvme.nvme_identify -- nvme/nvme.sh@15 -- # for bdf in "${bdfs[@]}" 00:07:33.875 04:57:53 nvme.nvme_identify -- nvme/nvme.sh@16 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:11.0' -i 0 00:07:34.134 ===================================================== 00:07:34.134 NVMe Controller at 0000:00:11.0 [1b36:0010] 00:07:34.134 ===================================================== 00:07:34.134 Controller Capabilities/Features 00:07:34.134 ================================ 00:07:34.134 Vendor ID: 1b36 00:07:34.134 Subsystem Vendor ID: 1af4 00:07:34.134 Serial Number: 12341 00:07:34.134 Model Number: QEMU NVMe Ctrl 00:07:34.134 Firmware Version: 8.0.0 00:07:34.134 Recommended Arb Burst: 6 00:07:34.134 IEEE OUI Identifier: 00 54 52 00:07:34.134 Multi-path I/O 00:07:34.134 May have multiple subsystem ports: No 00:07:34.134 May have multiple controllers: No 00:07:34.134 Associated with SR-IOV VF: No 00:07:34.134 Max Data Transfer Size: 524288 00:07:34.134 Max Number of Namespaces: 256 00:07:34.134 Max Number of I/O Queues: 64 00:07:34.134 NVMe 
Specification Version (VS): 1.4 00:07:34.134 NVMe Specification Version (Identify): 1.4 00:07:34.134 Maximum Queue Entries: 2048 00:07:34.134 Contiguous Queues Required: Yes 00:07:34.134 Arbitration Mechanisms Supported 00:07:34.134 Weighted Round Robin: Not Supported 00:07:34.134 Vendor Specific: Not Supported 00:07:34.134 Reset Timeout: 7500 ms 00:07:34.134 Doorbell Stride: 4 bytes 00:07:34.134 NVM Subsystem Reset: Not Supported 00:07:34.134 Command Sets Supported 00:07:34.134 NVM Command Set: Supported 00:07:34.134 Boot Partition: Not Supported 00:07:34.134 Memory Page Size Minimum: 4096 bytes 00:07:34.134 Memory Page Size Maximum: 65536 bytes 00:07:34.134 Persistent Memory Region: Not Supported 00:07:34.134 Optional Asynchronous Events Supported 00:07:34.134 Namespace Attribute Notices: Supported 00:07:34.134 Firmware Activation Notices: Not Supported 00:07:34.134 ANA Change Notices: Not Supported 00:07:34.134 PLE Aggregate Log Change Notices: Not Supported 00:07:34.134 LBA Status Info Alert Notices: Not Supported 00:07:34.134 EGE Aggregate Log Change Notices: Not Supported 00:07:34.134 Normal NVM Subsystem Shutdown event: Not Supported 00:07:34.134 Zone Descriptor Change Notices: Not Supported 00:07:34.134 Discovery Log Change Notices: Not Supported 00:07:34.134 Controller Attributes 00:07:34.134 128-bit Host Identifier: Not Supported 00:07:34.134 Non-Operational Permissive Mode: Not Supported 00:07:34.134 NVM Sets: Not Supported 00:07:34.134 Read Recovery Levels: Not Supported 00:07:34.134 Endurance Groups: Not Supported 00:07:34.134 Predictable Latency Mode: Not Supported 00:07:34.134 Traffic Based Keep Alive: Not Supported 00:07:34.134 Namespace Granularity: Not Supported 00:07:34.134 SQ Associations: Not Supported 00:07:34.134 UUID List: Not Supported 00:07:34.134 Multi-Domain Subsystem: Not Supported 00:07:34.134 Fixed Capacity Management: Not Supported 00:07:34.134 Variable Capacity Management: Not Supported 00:07:34.134 Delete Endurance Group: Not Supported 00:07:34.134 Delete NVM Set: Not Supported 00:07:34.134 Extended LBA Formats Supported: Supported 00:07:34.134 Flexible Data Placement Supported: Not Supported 00:07:34.134 00:07:34.134 Controller Memory Buffer Support 00:07:34.134 ================================ 00:07:34.134 Supported: No 00:07:34.134 00:07:34.135 Persistent Memory Region Support 00:07:34.135 ================================ 00:07:34.135 Supported: No 00:07:34.135 00:07:34.135 Admin Command Set Attributes 00:07:34.135 ============================ 00:07:34.135 Security Send/Receive: Not Supported 00:07:34.135 Format NVM: Supported 00:07:34.135 Firmware Activate/Download: Not Supported 00:07:34.135 Namespace Management: Supported 00:07:34.135 Device Self-Test: Not Supported 00:07:34.135 Directives: Supported 00:07:34.135 NVMe-MI: Not Supported 00:07:34.135 Virtualization Management: Not Supported 00:07:34.135 Doorbell Buffer Config: Supported 00:07:34.135 Get LBA Status Capability: Not Supported 00:07:34.135 Command & Feature Lockdown Capability: Not Supported 00:07:34.135 Abort Command Limit: 4 00:07:34.135 Async Event Request Limit: 4 00:07:34.135 Number of Firmware Slots: N/A 00:07:34.135 Firmware Slot 1 Read-Only: N/A 00:07:34.135 Firmware Activation Without Reset: N/A 00:07:34.135 Multiple Update Detection Support: N/A 00:07:34.135 Firmware Update Granularity: No Information Provided 00:07:34.135 Per-Namespace SMART Log: Yes 00:07:34.135 Asymmetric Namespace Access Log Page: Not Supported 00:07:34.135 Subsystem NQN: nqn.2019-08.org.qemu:12341
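(On the LBA format tables repeated throughout these dumps: each namespace advertises eight formats pairing a data size of 512 or 4096 bytes with a metadata size of 0, 8, 16, or 64 bytes. Metadata is either carried inline as an extended LBA, so a block under format #07 occupies 4096 + 64 = 4160 bytes, or moved through a separate metadata buffer, as the 12340 namespace above reports. Most namespaces in this run sit at format #04: 4096-byte data with no metadata.)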
00:07:34.135 Command Effects Log Page: Supported 00:07:34.135 Get Log Page Extended Data: Supported 00:07:34.135 Telemetry Log Pages: Not Supported 00:07:34.135 Persistent Event Log Pages: Not Supported 00:07:34.135 Supported Log Pages Log Page: May Support 00:07:34.135 Commands Supported & Effects Log Page: Not Supported 00:07:34.135 Feature Identifiers & Effects Log Page: May Support 00:07:34.135 NVMe-MI Commands & Effects Log Page: May Support 00:07:34.135 Data Area 4 for Telemetry Log: Not Supported 00:07:34.135 Error Log Page Entries Supported: 1 00:07:34.135 Keep Alive: Not Supported 00:07:34.135 00:07:34.135 NVM Command Set Attributes 00:07:34.135 ========================== 00:07:34.135 Submission Queue Entry Size 00:07:34.135 Max: 64 00:07:34.135 Min: 64 00:07:34.135 Completion Queue Entry Size 00:07:34.135 Max: 16 00:07:34.135 Min: 16 00:07:34.135 Number of Namespaces: 256 00:07:34.135 Compare Command: Supported 00:07:34.135 Write Uncorrectable Command: Not Supported 00:07:34.135 Dataset Management Command: Supported 00:07:34.135 Write Zeroes Command: Supported 00:07:34.135 Set Features Save Field: Supported 00:07:34.135 Reservations: Not Supported 00:07:34.135 Timestamp: Supported 00:07:34.135 Copy: Supported 00:07:34.135 Volatile Write Cache: Present 00:07:34.135 Atomic Write Unit (Normal): 1 00:07:34.135 Atomic Write Unit (PFail): 1 00:07:34.135 Atomic Compare & Write Unit: 1 00:07:34.135 Fused Compare & Write: Not Supported 00:07:34.135 Scatter-Gather List 00:07:34.135 SGL Command Set: Supported 00:07:34.135 SGL Keyed: Not Supported 00:07:34.135 SGL Bit Bucket Descriptor: Not Supported 00:07:34.135 SGL Metadata Pointer: Not Supported 00:07:34.135 Oversized SGL: Not Supported 00:07:34.135 SGL Metadata Address: Not Supported 00:07:34.135 SGL Offset: Not Supported 00:07:34.135 Transport SGL Data Block: Not Supported 00:07:34.135 Replay Protected Memory Block: Not Supported 00:07:34.135 00:07:34.135 Firmware Slot Information 00:07:34.135 ========================= 00:07:34.135 Active slot: 1 00:07:34.135 Slot 1 Firmware Revision: 1.0 00:07:34.135 00:07:34.135 00:07:34.135 Commands Supported and Effects 00:07:34.135 ============================== 00:07:34.135 Admin Commands 00:07:34.135 -------------- 00:07:34.135 Delete I/O Submission Queue (00h): Supported 00:07:34.135 Create I/O Submission Queue (01h): Supported 00:07:34.135 Get Log Page (02h): Supported 00:07:34.135 Delete I/O Completion Queue (04h): Supported 00:07:34.135 Create I/O Completion Queue (05h): Supported 00:07:34.135 Identify (06h): Supported 00:07:34.135 Abort (08h): Supported 00:07:34.135 Set Features (09h): Supported 00:07:34.135 Get Features (0Ah): Supported 00:07:34.135 Asynchronous Event Request (0Ch): Supported 00:07:34.135 Namespace Attachment (15h): Supported NS-Inventory-Change 00:07:34.135 Directive Send (19h): Supported 00:07:34.135 Directive Receive (1Ah): Supported 00:07:34.135 Virtualization Management (1Ch): Supported 00:07:34.135 Doorbell Buffer Config (7Ch): Supported 00:07:34.135 Format NVM (80h): Supported LBA-Change 00:07:34.135 I/O Commands 00:07:34.135 ------------ 00:07:34.135 Flush (00h): Supported LBA-Change 00:07:34.135 Write (01h): Supported LBA-Change 00:07:34.135 Read (02h): Supported 00:07:34.135 Compare (05h): Supported 00:07:34.135 Write Zeroes (08h): Supported LBA-Change 00:07:34.135 Dataset Management (09h): Supported LBA-Change 00:07:34.135 Unknown (0Ch): Supported 00:07:34.135 Unknown (12h): Supported 00:07:34.135 Copy (19h): Supported LBA-Change 00:07:34.135 Unknown (1Dh):
Supported LBA-Change 00:07:34.135 00:07:34.135 Error Log 00:07:34.135 ========= 00:07:34.135 00:07:34.135 Arbitration 00:07:34.135 =========== 00:07:34.135 Arbitration Burst: no limit 00:07:34.135 00:07:34.135 Power Management 00:07:34.135 ================ 00:07:34.135 Number of Power States: 1 00:07:34.135 Current Power State: Power State #0 00:07:34.135 Power State #0: 00:07:34.135 Max Power: 25.00 W 00:07:34.135 Non-Operational State: Operational 00:07:34.135 Entry Latency: 16 microseconds 00:07:34.135 Exit Latency: 4 microseconds 00:07:34.135 Relative Read Throughput: 0 00:07:34.135 Relative Read Latency: 0 00:07:34.135 Relative Write Throughput: 0 00:07:34.135 Relative Write Latency: 0 00:07:34.135 Idle Power: Not Reported 00:07:34.135 Active Power: Not Reported 00:07:34.135 Non-Operational Permissive Mode: Not Supported 00:07:34.135 00:07:34.135 Health Information 00:07:34.135 ================== 00:07:34.135 Critical Warnings: 00:07:34.135 Available Spare Space: OK 00:07:34.135 Temperature: OK 00:07:34.135 Device Reliability: OK 00:07:34.135 Read Only: No 00:07:34.135 Volatile Memory Backup: OK 00:07:34.135 Current Temperature: 323 Kelvin (50 Celsius) 00:07:34.135 Temperature Threshold: 343 Kelvin (70 Celsius) 00:07:34.135 Available Spare: 0% 00:07:34.135 Available Spare Threshold: 0% 00:07:34.135 Life Percentage Used: 0% 00:07:34.135 Data Units Read: 981 00:07:34.135 Data Units Written: 853 00:07:34.135 Host Read Commands: 50667 00:07:34.135 Host Write Commands: 49511 00:07:34.135 Controller Busy Time: 0 minutes 00:07:34.135 Power Cycles: 0 00:07:34.135 Power On Hours: 0 hours 00:07:34.135 Unsafe Shutdowns: 0 00:07:34.135 Unrecoverable Media Errors: 0 00:07:34.135 Lifetime Error Log Entries: 0 00:07:34.135 Warning Temperature Time: 0 minutes 00:07:34.135 Critical Temperature Time: 0 minutes 00:07:34.135 00:07:34.135 Number of Queues 00:07:34.135 ================ 00:07:34.135 Number of I/O Submission Queues: 64 00:07:34.135 Number of I/O Completion Queues: 64 00:07:34.135 00:07:34.135 ZNS Specific Controller Data 00:07:34.135 ============================ 00:07:34.135 Zone Append Size Limit: 0 00:07:34.135 00:07:34.135 00:07:34.135 Active Namespaces 00:07:34.135 ================= 00:07:34.135 Namespace ID:1 00:07:34.135 Error Recovery Timeout: Unlimited 00:07:34.135 Command Set Identifier: NVM (00h) 00:07:34.135 Deallocate: Supported 00:07:34.135 Deallocated/Unwritten Error: Supported 00:07:34.135 Deallocated Read Value: All 0x00 00:07:34.135 Deallocate in Write Zeroes: Not Supported 00:07:34.135 Deallocated Guard Field: 0xFFFF 00:07:34.135 Flush: Supported 00:07:34.135 Reservation: Not Supported 00:07:34.135 Namespace Sharing Capabilities: Private 00:07:34.135 Size (in LBAs): 1310720 (5GiB) 00:07:34.135 Capacity (in LBAs): 1310720 (5GiB) 00:07:34.135 Utilization (in LBAs): 1310720 (5GiB) 00:07:34.135 Thin Provisioning: Not Supported 00:07:34.135 Per-NS Atomic Units: No 00:07:34.135 Maximum Single Source Range Length: 128 00:07:34.135 Maximum Copy Length: 128 00:07:34.135 Maximum Source Range Count: 128 00:07:34.135 NGUID/EUI64 Never Reused: No 00:07:34.135 Namespace Write Protected: No 00:07:34.135 Number of LBA Formats: 8 00:07:34.135 Current LBA Format: LBA Format #04 00:07:34.135 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:34.135 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:34.135 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:34.135 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:34.135 LBA Format #04: Data Size: 4096 Metadata Size: 0 
00:07:34.135 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:34.135 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:34.135 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:34.135 00:07:34.135 NVM Specific Namespace Data 00:07:34.135 =========================== 00:07:34.135 Logical Block Storage Tag Mask: 0 00:07:34.135 Protection Information Capabilities: 00:07:34.135 16b Guard Protection Information Storage Tag Support: No 00:07:34.135 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:34.135 Storage Tag Check Read Support: No 00:07:34.135 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.136 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.136 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.136 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.136 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.136 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.136 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.136 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.136 04:57:54 nvme.nvme_identify -- nvme/nvme.sh@15 -- # for bdf in "${bdfs[@]}" 00:07:34.136 04:57:54 nvme.nvme_identify -- nvme/nvme.sh@16 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:12.0' -i 0 00:07:34.396 ===================================================== 00:07:34.396 NVMe Controller at 0000:00:12.0 [1b36:0010] 00:07:34.396 ===================================================== 00:07:34.396 Controller Capabilities/Features 00:07:34.396 ================================ 00:07:34.396 Vendor ID: 1b36 00:07:34.396 Subsystem Vendor ID: 1af4 00:07:34.396 Serial Number: 12342 00:07:34.396 Model Number: QEMU NVMe Ctrl 00:07:34.396 Firmware Version: 8.0.0 00:07:34.396 Recommended Arb Burst: 6 00:07:34.396 IEEE OUI Identifier: 00 54 52 00:07:34.396 Multi-path I/O 00:07:34.396 May have multiple subsystem ports: No 00:07:34.396 May have multiple controllers: No 00:07:34.396 Associated with SR-IOV VF: No 00:07:34.396 Max Data Transfer Size: 524288 00:07:34.396 Max Number of Namespaces: 256 00:07:34.396 Max Number of I/O Queues: 64 00:07:34.396 NVMe Specification Version (VS): 1.4 00:07:34.396 NVMe Specification Version (Identify): 1.4 00:07:34.396 Maximum Queue Entries: 2048 00:07:34.396 Contiguous Queues Required: Yes 00:07:34.396 Arbitration Mechanisms Supported 00:07:34.396 Weighted Round Robin: Not Supported 00:07:34.396 Vendor Specific: Not Supported 00:07:34.396 Reset Timeout: 7500 ms 00:07:34.396 Doorbell Stride: 4 bytes 00:07:34.396 NVM Subsystem Reset: Not Supported 00:07:34.396 Command Sets Supported 00:07:34.396 NVM Command Set: Supported 00:07:34.396 Boot Partition: Not Supported 00:07:34.396 Memory Page Size Minimum: 4096 bytes 00:07:34.396 Memory Page Size Maximum: 65536 bytes 00:07:34.396 Persistent Memory Region: Not Supported 00:07:34.396 Optional Asynchronous Events Supported 00:07:34.396 Namespace Attribute Notices: Supported 00:07:34.396 Firmware Activation Notices: Not Supported 00:07:34.396 ANA Change Notices: Not Supported 00:07:34.396 PLE Aggregate Log Change Notices: Not Supported 00:07:34.396 LBA Status Info Alert Notices: 
Not Supported 00:07:34.396 EGE Aggregate Log Change Notices: Not Supported 00:07:34.396 Normal NVM Subsystem Shutdown event: Not Supported 00:07:34.396 Zone Descriptor Change Notices: Not Supported 00:07:34.396 Discovery Log Change Notices: Not Supported 00:07:34.396 Controller Attributes 00:07:34.396 128-bit Host Identifier: Not Supported 00:07:34.396 Non-Operational Permissive Mode: Not Supported 00:07:34.396 NVM Sets: Not Supported 00:07:34.396 Read Recovery Levels: Not Supported 00:07:34.396 Endurance Groups: Not Supported 00:07:34.396 Predictable Latency Mode: Not Supported 00:07:34.396 Traffic Based Keep Alive: Not Supported 00:07:34.396 Namespace Granularity: Not Supported 00:07:34.396 SQ Associations: Not Supported 00:07:34.396 UUID List: Not Supported 00:07:34.396 Multi-Domain Subsystem: Not Supported 00:07:34.396 Fixed Capacity Management: Not Supported 00:07:34.396 Variable Capacity Management: Not Supported 00:07:34.396 Delete Endurance Group: Not Supported 00:07:34.396 Delete NVM Set: Not Supported 00:07:34.396 Extended LBA Formats Supported: Supported 00:07:34.396 Flexible Data Placement Supported: Not Supported 00:07:34.396 00:07:34.396 Controller Memory Buffer Support 00:07:34.396 ================================ 00:07:34.396 Supported: No 00:07:34.396 00:07:34.396 Persistent Memory Region Support 00:07:34.396 ================================ 00:07:34.396 Supported: No 00:07:34.396 00:07:34.396 Admin Command Set Attributes 00:07:34.396 ============================ 00:07:34.396 Security Send/Receive: Not Supported 00:07:34.396 Format NVM: Supported 00:07:34.396 Firmware Activate/Download: Not Supported 00:07:34.396 Namespace Management: Supported 00:07:34.396 Device Self-Test: Not Supported 00:07:34.396 Directives: Supported 00:07:34.396 NVMe-MI: Not Supported 00:07:34.396 Virtualization Management: Not Supported 00:07:34.396 Doorbell Buffer Config: Supported 00:07:34.396 Get LBA Status Capability: Not Supported 00:07:34.396 Command & Feature Lockdown Capability: Not Supported 00:07:34.396 Abort Command Limit: 4 00:07:34.396 Async Event Request Limit: 4 00:07:34.396 Number of Firmware Slots: N/A 00:07:34.396 Firmware Slot 1 Read-Only: N/A 00:07:34.396 Firmware Activation Without Reset: N/A 00:07:34.396 Multiple Update Detection Support: N/A 00:07:34.396 Firmware Update Granularity: No Information Provided 00:07:34.396 Per-Namespace SMART Log: Yes 00:07:34.396 Asymmetric Namespace Access Log Page: Not Supported 00:07:34.396 Subsystem NQN: nqn.2019-08.org.qemu:12342 00:07:34.396 Command Effects Log Page: Supported 00:07:34.396 Get Log Page Extended Data: Supported 00:07:34.396 Telemetry Log Pages: Not Supported 00:07:34.396 Persistent Event Log Pages: Not Supported 00:07:34.396 Supported Log Pages Log Page: May Support 00:07:34.396 Commands Supported & Effects Log Page: Not Supported 00:07:34.396 Feature Identifiers & Effects Log Page: May Support 00:07:34.396 NVMe-MI Commands & Effects Log Page: May Support 00:07:34.396 Data Area 4 for Telemetry Log: Not Supported 00:07:34.396 Error Log Page Entries Supported: 1 00:07:34.396 Keep Alive: Not Supported 00:07:34.396 00:07:34.396 NVM Command Set Attributes 00:07:34.396 ========================== 00:07:34.396 Submission Queue Entry Size 00:07:34.396 Max: 64 00:07:34.396 Min: 64 00:07:34.396 Completion Queue Entry Size 00:07:34.396 Max: 16 00:07:34.396 Min: 16 00:07:34.396 Number of Namespaces: 256 00:07:34.396 Compare Command: Supported 00:07:34.396 Write Uncorrectable Command: Not Supported 00:07:34.396 Dataset Management Command:
Supported 00:07:34.396 Write Zeroes Command: Supported 00:07:34.396 Set Features Save Field: Supported 00:07:34.397 Reservations: Not Supported 00:07:34.397 Timestamp: Supported 00:07:34.397 Copy: Supported 00:07:34.397 Volatile Write Cache: Present 00:07:34.397 Atomic Write Unit (Normal): 1 00:07:34.397 Atomic Write Unit (PFail): 1 00:07:34.397 Atomic Compare & Write Unit: 1 00:07:34.397 Fused Compare & Write: Not Supported 00:07:34.397 Scatter-Gather List 00:07:34.397 SGL Command Set: Supported 00:07:34.397 SGL Keyed: Not Supported 00:07:34.397 SGL Bit Bucket Descriptor: Not Supported 00:07:34.397 SGL Metadata Pointer: Not Supported 00:07:34.397 Oversized SGL: Not Supported 00:07:34.397 SGL Metadata Address: Not Supported 00:07:34.397 SGL Offset: Not Supported 00:07:34.397 Transport SGL Data Block: Not Supported 00:07:34.397 Replay Protected Memory Block: Not Supported 00:07:34.397 00:07:34.397 Firmware Slot Information 00:07:34.397 ========================= 00:07:34.397 Active slot: 1 00:07:34.397 Slot 1 Firmware Revision: 1.0 00:07:34.397 00:07:34.397 00:07:34.397 Commands Supported and Effects 00:07:34.397 ============================== 00:07:34.397 Admin Commands 00:07:34.397 -------------- 00:07:34.397 Delete I/O Submission Queue (00h): Supported 00:07:34.397 Create I/O Submission Queue (01h): Supported 00:07:34.397 Get Log Page (02h): Supported 00:07:34.397 Delete I/O Completion Queue (04h): Supported 00:07:34.397 Create I/O Completion Queue (05h): Supported 00:07:34.397 Identify (06h): Supported 00:07:34.397 Abort (08h): Supported 00:07:34.397 Set Features (09h): Supported 00:07:34.397 Get Features (0Ah): Supported 00:07:34.397 Asynchronous Event Request (0Ch): Supported 00:07:34.397 Namespace Attachment (15h): Supported NS-Inventory-Change 00:07:34.397 Directive Send (19h): Supported 00:07:34.397 Directive Receive (1Ah): Supported 00:07:34.397 Virtualization Management (1Ch): Supported 00:07:34.397 Doorbell Buffer Config (7Ch): Supported 00:07:34.397 Format NVM (80h): Supported LBA-Change 00:07:34.397 I/O Commands 00:07:34.397 ------------ 00:07:34.397 Flush (00h): Supported LBA-Change 00:07:34.397 Write (01h): Supported LBA-Change 00:07:34.397 Read (02h): Supported 00:07:34.397 Compare (05h): Supported 00:07:34.397 Write Zeroes (08h): Supported LBA-Change 00:07:34.397 Dataset Management (09h): Supported LBA-Change 00:07:34.397 Unknown (0Ch): Supported 00:07:34.397 Unknown (12h): Supported 00:07:34.397 Copy (19h): Supported LBA-Change 00:07:34.397 Unknown (1Dh): Supported LBA-Change 00:07:34.397 00:07:34.397 Error Log 00:07:34.397 ========= 00:07:34.397 00:07:34.397 Arbitration 00:07:34.397 =========== 00:07:34.397 Arbitration Burst: no limit 00:07:34.397 00:07:34.397 Power Management 00:07:34.397 ================ 00:07:34.397 Number of Power States: 1 00:07:34.397 Current Power State: Power State #0 00:07:34.397 Power State #0: 00:07:34.397 Max Power: 25.00 W 00:07:34.397 Non-Operational State: Operational 00:07:34.397 Entry Latency: 16 microseconds 00:07:34.397 Exit Latency: 4 microseconds 00:07:34.397 Relative Read Throughput: 0 00:07:34.397 Relative Read Latency: 0 00:07:34.397 Relative Write Throughput: 0 00:07:34.397 Relative Write Latency: 0 00:07:34.397 Idle Power: Not Reported 00:07:34.397 Active Power: Not Reported 00:07:34.397 Non-Operational Permissive Mode: Not Supported 00:07:34.397 00:07:34.397 Health Information 00:07:34.397 ================== 00:07:34.397 Critical Warnings: 00:07:34.397 Available Spare Space: OK 00:07:34.397 Temperature: OK 00:07:34.397 Device 
Reliability: OK 00:07:34.397 Read Only: No 00:07:34.397 Volatile Memory Backup: OK 00:07:34.397 Current Temperature: 323 Kelvin (50 Celsius) 00:07:34.397 Temperature Threshold: 343 Kelvin (70 Celsius) 00:07:34.397 Available Spare: 0% 00:07:34.397 Available Spare Threshold: 0% 00:07:34.397 Life Percentage Used: 0% 00:07:34.397 Data Units Read: 2233 00:07:34.397 Data Units Written: 2020 00:07:34.397 Host Read Commands: 104082 00:07:34.397 Host Write Commands: 102353 00:07:34.397 Controller Busy Time: 0 minutes 00:07:34.397 Power Cycles: 0 00:07:34.397 Power On Hours: 0 hours 00:07:34.397 Unsafe Shutdowns: 0 00:07:34.397 Unrecoverable Media Errors: 0 00:07:34.397 Lifetime Error Log Entries: 0 00:07:34.397 Warning Temperature Time: 0 minutes 00:07:34.397 Critical Temperature Time: 0 minutes 00:07:34.397 00:07:34.397 Number of Queues 00:07:34.397 ================ 00:07:34.397 Number of I/O Submission Queues: 64 00:07:34.397 Number of I/O Completion Queues: 64 00:07:34.397 00:07:34.397 ZNS Specific Controller Data 00:07:34.397 ============================ 00:07:34.397 Zone Append Size Limit: 0 00:07:34.397 00:07:34.397 00:07:34.397 Active Namespaces 00:07:34.397 ================= 00:07:34.397 Namespace ID:1 00:07:34.397 Error Recovery Timeout: Unlimited 00:07:34.397 Command Set Identifier: NVM (00h) 00:07:34.397 Deallocate: Supported 00:07:34.397 Deallocated/Unwritten Error: Supported 00:07:34.397 Deallocated Read Value: All 0x00 00:07:34.397 Deallocate in Write Zeroes: Not Supported 00:07:34.397 Deallocated Guard Field: 0xFFFF 00:07:34.397 Flush: Supported 00:07:34.397 Reservation: Not Supported 00:07:34.397 Namespace Sharing Capabilities: Private 00:07:34.397 Size (in LBAs): 1048576 (4GiB) 00:07:34.397 Capacity (in LBAs): 1048576 (4GiB) 00:07:34.397 Utilization (in LBAs): 1048576 (4GiB) 00:07:34.397 Thin Provisioning: Not Supported 00:07:34.397 Per-NS Atomic Units: No 00:07:34.397 Maximum Single Source Range Length: 128 00:07:34.397 Maximum Copy Length: 128 00:07:34.397 Maximum Source Range Count: 128 00:07:34.397 NGUID/EUI64 Never Reused: No 00:07:34.397 Namespace Write Protected: No 00:07:34.397 Number of LBA Formats: 8 00:07:34.397 Current LBA Format: LBA Format #04 00:07:34.397 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:34.397 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:34.397 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:34.397 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:34.397 LBA Format #04: Data Size: 4096 Metadata Size: 0 00:07:34.397 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:34.397 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:34.397 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:34.397 00:07:34.397 NVM Specific Namespace Data 00:07:34.397 =========================== 00:07:34.397 Logical Block Storage Tag Mask: 0 00:07:34.397 Protection Information Capabilities: 00:07:34.397 16b Guard Protection Information Storage Tag Support: No 00:07:34.397 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:34.397 Storage Tag Check Read Support: No 00:07:34.397 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.397 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.397 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.397 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.397 Extended LBA Format #04: 
Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.397 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.397 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.397 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.397 Namespace ID:2 00:07:34.397 Error Recovery Timeout: Unlimited 00:07:34.397 Command Set Identifier: NVM (00h) 00:07:34.397 Deallocate: Supported 00:07:34.397 Deallocated/Unwritten Error: Supported 00:07:34.397 Deallocated Read Value: All 0x00 00:07:34.397 Deallocate in Write Zeroes: Not Supported 00:07:34.397 Deallocated Guard Field: 0xFFFF 00:07:34.397 Flush: Supported 00:07:34.397 Reservation: Not Supported 00:07:34.397 Namespace Sharing Capabilities: Private 00:07:34.397 Size (in LBAs): 1048576 (4GiB) 00:07:34.397 Capacity (in LBAs): 1048576 (4GiB) 00:07:34.397 Utilization (in LBAs): 1048576 (4GiB) 00:07:34.397 Thin Provisioning: Not Supported 00:07:34.397 Per-NS Atomic Units: No 00:07:34.397 Maximum Single Source Range Length: 128 00:07:34.397 Maximum Copy Length: 128 00:07:34.397 Maximum Source Range Count: 128 00:07:34.397 NGUID/EUI64 Never Reused: No 00:07:34.397 Namespace Write Protected: No 00:07:34.397 Number of LBA Formats: 8 00:07:34.397 Current LBA Format: LBA Format #04 00:07:34.397 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:34.397 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:34.397 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:34.397 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:34.397 LBA Format #04: Data Size: 4096 Metadata Size: 0 00:07:34.397 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:34.397 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:34.397 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:34.397 00:07:34.397 NVM Specific Namespace Data 00:07:34.397 =========================== 00:07:34.397 Logical Block Storage Tag Mask: 0 00:07:34.397 Protection Information Capabilities: 00:07:34.397 16b Guard Protection Information Storage Tag Support: No 00:07:34.397 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:34.397 Storage Tag Check Read Support: No 00:07:34.397 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.397 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Namespace ID:3 00:07:34.398 Error Recovery Timeout: Unlimited 00:07:34.398 Command Set Identifier: NVM (00h) 00:07:34.398 Deallocate: Supported 00:07:34.398 Deallocated/Unwritten Error: Supported 00:07:34.398 Deallocated Read Value: All 0x00 00:07:34.398 Deallocate in Write Zeroes: Not Supported 00:07:34.398 Deallocated Guard Field: 0xFFFF 00:07:34.398 Flush: Supported 00:07:34.398 Reservation: Not Supported 00:07:34.398 
Namespace Sharing Capabilities: Private 00:07:34.398 Size (in LBAs): 1048576 (4GiB) 00:07:34.398 Capacity (in LBAs): 1048576 (4GiB) 00:07:34.398 Utilization (in LBAs): 1048576 (4GiB) 00:07:34.398 Thin Provisioning: Not Supported 00:07:34.398 Per-NS Atomic Units: No 00:07:34.398 Maximum Single Source Range Length: 128 00:07:34.398 Maximum Copy Length: 128 00:07:34.398 Maximum Source Range Count: 128 00:07:34.398 NGUID/EUI64 Never Reused: No 00:07:34.398 Namespace Write Protected: No 00:07:34.398 Number of LBA Formats: 8 00:07:34.398 Current LBA Format: LBA Format #04 00:07:34.398 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:34.398 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:34.398 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:34.398 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:34.398 LBA Format #04: Data Size: 4096 Metadata Size: 0 00:07:34.398 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:34.398 LBA Format #06: Data Size: 4096 Metadata Size: 16 00:07:34.398 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:34.398 00:07:34.398 NVM Specific Namespace Data 00:07:34.398 =========================== 00:07:34.398 Logical Block Storage Tag Mask: 0 00:07:34.398 Protection Information Capabilities: 00:07:34.398 16b Guard Protection Information Storage Tag Support: No 00:07:34.398 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:34.398 Storage Tag Check Read Support: No 00:07:34.398 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 Extended LBA Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.398 04:57:54 nvme.nvme_identify -- nvme/nvme.sh@15 -- # for bdf in "${bdfs[@]}" 00:07:34.398 04:57:54 nvme.nvme_identify -- nvme/nvme.sh@16 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:13.0' -i 0 00:07:34.398 ===================================================== 00:07:34.398 NVMe Controller at 0000:00:13.0 [1b36:0010] 00:07:34.398 ===================================================== 00:07:34.398 Controller Capabilities/Features 00:07:34.398 ================================ 00:07:34.398 Vendor ID: 1b36 00:07:34.398 Subsystem Vendor ID: 1af4 00:07:34.398 Serial Number: 12343 00:07:34.398 Model Number: QEMU NVMe Ctrl 00:07:34.398 Firmware Version: 8.0.0 00:07:34.398 Recommended Arb Burst: 6 00:07:34.398 IEEE OUI Identifier: 00 54 52 00:07:34.398 Multi-path I/O 00:07:34.398 May have multiple subsystem ports: No 00:07:34.398 May have multiple controllers: Yes 00:07:34.398 Associated with SR-IOV VF: No 00:07:34.398 Max Data Transfer Size: 524288 00:07:34.398 Max Number of Namespaces: 256 00:07:34.398 Max Number of I/O Queues: 64 00:07:34.398 NVMe Specification Version (VS): 1.4 00:07:34.398 NVMe Specification Version (Identify): 1.4 00:07:34.398 Maximum Queue Entries: 2048 
00:07:34.398 Contiguous Queues Required: Yes 00:07:34.398 Arbitration Mechanisms Supported 00:07:34.398 Weighted Round Robin: Not Supported 00:07:34.398 Vendor Specific: Not Supported 00:07:34.398 Reset Timeout: 7500 ms 00:07:34.398 Doorbell Stride: 4 bytes 00:07:34.398 NVM Subsystem Reset: Not Supported 00:07:34.398 Command Sets Supported 00:07:34.398 NVM Command Set: Supported 00:07:34.398 Boot Partition: Not Supported 00:07:34.398 Memory Page Size Minimum: 4096 bytes 00:07:34.398 Memory Page Size Maximum: 65536 bytes 00:07:34.398 Persistent Memory Region: Not Supported 00:07:34.398 Optional Asynchronous Events Supported 00:07:34.398 Namespace Attribute Notices: Supported 00:07:34.398 Firmware Activation Notices: Not Supported 00:07:34.398 ANA Change Notices: Not Supported 00:07:34.398 PLE Aggregate Log Change Notices: Not Supported 00:07:34.398 LBA Status Info Alert Notices: Not Supported 00:07:34.398 EGE Aggregate Log Change Notices: Not Supported 00:07:34.398 Normal NVM Subsystem Shutdown event: Not Supported 00:07:34.398 Zone Descriptor Change Notices: Not Supported 00:07:34.398 Discovery Log Change Notices: Not Supported 00:07:34.398 Controller Attributes 00:07:34.398 128-bit Host Identifier: Not Supported 00:07:34.398 Non-Operational Permissive Mode: Not Supported 00:07:34.398 NVM Sets: Not Supported 00:07:34.398 Read Recovery Levels: Not Supported 00:07:34.398 Endurance Groups: Supported 00:07:34.398 Predictable Latency Mode: Not Supported 00:07:34.398 Traffic Based Keep Alive: Not Supported 00:07:34.398 Namespace Granularity: Not Supported 00:07:34.398 SQ Associations: Not Supported 00:07:34.398 UUID List: Not Supported 00:07:34.398 Multi-Domain Subsystem: Not Supported 00:07:34.398 Fixed Capacity Management: Not Supported 00:07:34.398 Variable Capacity Management: Not Supported 00:07:34.398 Delete Endurance Group: Not Supported 00:07:34.398 Delete NVM Set: Not Supported 00:07:34.398 Extended LBA Formats Supported: Supported 00:07:34.398 Flexible Data Placement Supported: Supported 00:07:34.398 00:07:34.398 Controller Memory Buffer Support 00:07:34.398 ================================ 00:07:34.398 Supported: No 00:07:34.398 00:07:34.398 Persistent Memory Region Support 00:07:34.398 ================================ 00:07:34.398 Supported: No 00:07:34.398 00:07:34.398 Admin Command Set Attributes 00:07:34.398 ============================ 00:07:34.398 Security Send/Receive: Not Supported 00:07:34.398 Format NVM: Supported 00:07:34.398 Firmware Activate/Download: Not Supported 00:07:34.398 Namespace Management: Supported 00:07:34.398 Device Self-Test: Not Supported 00:07:34.398 Directives: Supported 00:07:34.398 NVMe-MI: Not Supported 00:07:34.398 Virtualization Management: Not Supported 00:07:34.398 Doorbell Buffer Config: Supported 00:07:34.398 Get LBA Status Capability: Not Supported 00:07:34.398 Command & Feature Lockdown Capability: Not Supported 00:07:34.398 Abort Command Limit: 4 00:07:34.398 Async Event Request Limit: 4 00:07:34.398 Number of Firmware Slots: N/A 00:07:34.398 Firmware Slot 1 Read-Only: N/A 00:07:34.398 Firmware Activation Without Reset: N/A 00:07:34.398 Multiple Update Detection Support: N/A 00:07:34.398 Firmware Update Granularity: No Information Provided 00:07:34.398 Per-Namespace SMART Log: Yes 00:07:34.398 Asymmetric Namespace Access Log Page: Not Supported 00:07:34.398 Subsystem NQN: nqn.2019-08.org.qemu:fdp-subsys3 00:07:34.398 Command Effects Log Page: Supported 00:07:34.398 Get Log Page Extended Data: Supported 00:07:34.398 Telemetry Log Pages: Not
Supported 00:07:34.398 Persistent Event Log Pages: Not Supported 00:07:34.398 Supported Log Pages Log Page: May Support 00:07:34.398 Commands Supported & Effects Log Page: Not Supported 00:07:34.398 Feature Identifiers & Effects Log Page: May Support 00:07:34.398 NVMe-MI Commands & Effects Log Page: May Support 00:07:34.398 Data Area 4 for Telemetry Log: Not Supported 00:07:34.398 Error Log Page Entries Supported: 1 00:07:34.398 Keep Alive: Not Supported 00:07:34.398 00:07:34.398 NVM Command Set Attributes 00:07:34.398 ========================== 00:07:34.398 Submission Queue Entry Size 00:07:34.398 Max: 64 00:07:34.398 Min: 64 00:07:34.398 Completion Queue Entry Size 00:07:34.398 Max: 16 00:07:34.398 Min: 16 00:07:34.398 Number of Namespaces: 256 00:07:34.398 Compare Command: Supported 00:07:34.398 Write Uncorrectable Command: Not Supported 00:07:34.398 Dataset Management Command: Supported 00:07:34.398 Write Zeroes Command: Supported 00:07:34.398 Set Features Save Field: Supported 00:07:34.398 Reservations: Not Supported 00:07:34.398 Timestamp: Supported 00:07:34.398 Copy: Supported 00:07:34.398 Volatile Write Cache: Present 00:07:34.398 Atomic Write Unit (Normal): 1 00:07:34.398 Atomic Write Unit (PFail): 1 00:07:34.398 Atomic Compare & Write Unit: 1 00:07:34.398 Fused Compare & Write: Not Supported 00:07:34.398 Scatter-Gather List 00:07:34.398 SGL Command Set: Supported 00:07:34.398 SGL Keyed: Not Supported 00:07:34.399 SGL Bit Bucket Descriptor: Not Supported 00:07:34.399 SGL Metadata Pointer: Not Supported 00:07:34.399 Oversized SGL: Not Supported 00:07:34.399 SGL Metadata Address: Not Supported 00:07:34.399 SGL Offset: Not Supported 00:07:34.399 Transport SGL Data Block: Not Supported 00:07:34.399 Replay Protected Memory Block: Not Supported 00:07:34.399 00:07:34.399 Firmware Slot Information 00:07:34.399 ========================= 00:07:34.399 Active slot: 1 00:07:34.399 Slot 1 Firmware Revision: 1.0 00:07:34.399 00:07:34.399 00:07:34.399 Commands Supported and Effects 00:07:34.399 ============================== 00:07:34.399 Admin Commands 00:07:34.399 -------------- 00:07:34.399 Delete I/O Submission Queue (00h): Supported 00:07:34.399 Create I/O Submission Queue (01h): Supported 00:07:34.399 Get Log Page (02h): Supported 00:07:34.399 Delete I/O Completion Queue (04h): Supported 00:07:34.399 Create I/O Completion Queue (05h): Supported 00:07:34.399 Identify (06h): Supported 00:07:34.399 Abort (08h): Supported 00:07:34.399 Set Features (09h): Supported 00:07:34.399 Get Features (0Ah): Supported 00:07:34.399 Asynchronous Event Request (0Ch): Supported 00:07:34.399 Namespace Attachment (15h): Supported NS-Inventory-Change 00:07:34.399 Directive Send (19h): Supported 00:07:34.399 Directive Receive (1Ah): Supported 00:07:34.399 Virtualization Management (1Ch): Supported 00:07:34.399 Doorbell Buffer Config (7Ch): Supported 00:07:34.399 Format NVM (80h): Supported LBA-Change 00:07:34.399 I/O Commands 00:07:34.399 ------------ 00:07:34.399 Flush (00h): Supported LBA-Change 00:07:34.399 Write (01h): Supported LBA-Change 00:07:34.399 Read (02h): Supported 00:07:34.399 Compare (05h): Supported 00:07:34.399 Write Zeroes (08h): Supported LBA-Change 00:07:34.399 Dataset Management (09h): Supported LBA-Change 00:07:34.399 Unknown (0Ch): Supported 00:07:34.399 Unknown (12h): Supported 00:07:34.399 Copy (19h): Supported LBA-Change 00:07:34.399 Unknown (1Dh): Supported LBA-Change 00:07:34.399 00:07:34.399 Error Log 00:07:34.399 ========= 00:07:34.399 00:07:34.399 Arbitration 00:07:34.399 ===========
00:07:34.399 Arbitration Burst: no limit 00:07:34.399 00:07:34.399 Power Management 00:07:34.399 ================ 00:07:34.399 Number of Power States: 1 00:07:34.399 Current Power State: Power State #0 00:07:34.399 Power State #0: 00:07:34.399 Max Power: 25.00 W 00:07:34.399 Non-Operational State: Operational 00:07:34.399 Entry Latency: 16 microseconds 00:07:34.399 Exit Latency: 4 microseconds 00:07:34.399 Relative Read Throughput: 0 00:07:34.399 Relative Read Latency: 0 00:07:34.399 Relative Write Throughput: 0 00:07:34.399 Relative Write Latency: 0 00:07:34.399 Idle Power: Not Reported 00:07:34.399 Active Power: Not Reported 00:07:34.399 Non-Operational Permissive Mode: Not Supported 00:07:34.399 00:07:34.399 Health Information 00:07:34.399 ================== 00:07:34.399 Critical Warnings: 00:07:34.399 Available Spare Space: OK 00:07:34.399 Temperature: OK 00:07:34.399 Device Reliability: OK 00:07:34.399 Read Only: No 00:07:34.399 Volatile Memory Backup: OK 00:07:34.399 Current Temperature: 323 Kelvin (50 Celsius) 00:07:34.399 Temperature Threshold: 343 Kelvin (70 Celsius) 00:07:34.399 Available Spare: 0% 00:07:34.399 Available Spare Threshold: 0% 00:07:34.399 Life Percentage Used: 0% 00:07:34.399 Data Units Read: 1032 00:07:34.399 Data Units Written: 961 00:07:34.399 Host Read Commands: 37123 00:07:34.399 Host Write Commands: 36546 00:07:34.399 Controller Busy Time: 0 minutes 00:07:34.399 Power Cycles: 0 00:07:34.399 Power On Hours: 0 hours 00:07:34.399 Unsafe Shutdowns: 0 00:07:34.399 Unrecoverable Media Errors: 0 00:07:34.399 Lifetime Error Log Entries: 0 00:07:34.399 Warning Temperature Time: 0 minutes 00:07:34.399 Critical Temperature Time: 0 minutes 00:07:34.399 00:07:34.399 Number of Queues 00:07:34.399 ================ 00:07:34.399 Number of I/O Submission Queues: 64 00:07:34.399 Number of I/O Completion Queues: 64 00:07:34.399 00:07:34.399 ZNS Specific Controller Data 00:07:34.399 ============================ 00:07:34.399 Zone Append Size Limit: 0 00:07:34.399 00:07:34.399 00:07:34.399 Active Namespaces 00:07:34.399 ================= 00:07:34.399 Namespace ID:1 00:07:34.399 Error Recovery Timeout: Unlimited 00:07:34.399 Command Set Identifier: NVM (00h) 00:07:34.399 Deallocate: Supported 00:07:34.399 Deallocated/Unwritten Error: Supported 00:07:34.399 Deallocated Read Value: All 0x00 00:07:34.399 Deallocate in Write Zeroes: Not Supported 00:07:34.399 Deallocated Guard Field: 0xFFFF 00:07:34.399 Flush: Supported 00:07:34.399 Reservation: Not Supported 00:07:34.399 Namespace Sharing Capabilities: Multiple Controllers 00:07:34.399 Size (in LBAs): 262144 (1GiB) 00:07:34.399 Capacity (in LBAs): 262144 (1GiB) 00:07:34.399 Utilization (in LBAs): 262144 (1GiB) 00:07:34.399 Thin Provisioning: Not Supported 00:07:34.399 Per-NS Atomic Units: No 00:07:34.399 Maximum Single Source Range Length: 128 00:07:34.399 Maximum Copy Length: 128 00:07:34.399 Maximum Source Range Count: 128 00:07:34.399 NGUID/EUI64 Never Reused: No 00:07:34.399 Namespace Write Protected: No 00:07:34.399 Endurance group ID: 1 00:07:34.399 Number of LBA Formats: 8 00:07:34.399 Current LBA Format: LBA Format #04 00:07:34.399 LBA Format #00: Data Size: 512 Metadata Size: 0 00:07:34.399 LBA Format #01: Data Size: 512 Metadata Size: 8 00:07:34.399 LBA Format #02: Data Size: 512 Metadata Size: 16 00:07:34.399 LBA Format #03: Data Size: 512 Metadata Size: 64 00:07:34.399 LBA Format #04: Data Size: 4096 Metadata Size: 0 00:07:34.399 LBA Format #05: Data Size: 4096 Metadata Size: 8 00:07:34.399 LBA Format #06: Data Size: 4096 
Metadata Size: 16 00:07:34.399 LBA Format #07: Data Size: 4096 Metadata Size: 64 00:07:34.399 00:07:34.399 Get Feature FDP: 00:07:34.399 ================ 00:07:34.399 Enabled: Yes 00:07:34.399 FDP configuration index: 0 00:07:34.399 00:07:34.399 FDP configurations log page 00:07:34.399 =========================== 00:07:34.399 Number of FDP configurations: 1 00:07:34.399 Version: 0 00:07:34.399 Size: 112 00:07:34.399 FDP Configuration Descriptor: 0 00:07:34.399 Descriptor Size: 96 00:07:34.399 Reclaim Group Identifier format: 2 00:07:34.399 FDP Volatile Write Cache: Not Present 00:07:34.399 FDP Configuration: Valid 00:07:34.399 Vendor Specific Size: 0 00:07:34.399 Number of Reclaim Groups: 2 00:07:34.399 Number of Reclaim Unit Handles: 8 00:07:34.399 Max Placement Identifiers: 128 00:07:34.399 Number of Namespaces Supported: 256 00:07:34.399 Reclaim Unit Nominal Size: 6000000 bytes 00:07:34.399 Estimated Reclaim Unit Time Limit: Not Reported 00:07:34.399 RUH Desc #000: RUH Type: Initially Isolated 00:07:34.399 RUH Desc #001: RUH Type: Initially Isolated 00:07:34.399 RUH Desc #002: RUH Type: Initially Isolated 00:07:34.399 RUH Desc #003: RUH Type: Initially Isolated 00:07:34.399 RUH Desc #004: RUH Type: Initially Isolated 00:07:34.399 RUH Desc #005: RUH Type: Initially Isolated 00:07:34.399 RUH Desc #006: RUH Type: Initially Isolated 00:07:34.399 RUH Desc #007: RUH Type: Initially Isolated 00:07:34.399 00:07:34.399 FDP reclaim unit handle usage log page 00:07:34.399 ====================================== 00:07:34.399 Number of Reclaim Unit Handles: 8 00:07:34.399 RUH Usage Desc #000: RUH Attributes: Controller Specified 00:07:34.399 RUH Usage Desc #001: RUH Attributes: Unused 00:07:34.399 RUH Usage Desc #002: RUH Attributes: Unused 00:07:34.399 RUH Usage Desc #003: RUH Attributes: Unused 00:07:34.399 RUH Usage Desc #004: RUH Attributes: Unused 00:07:34.399 RUH Usage Desc #005: RUH Attributes: Unused 00:07:34.399 RUH Usage Desc #006: RUH Attributes: Unused 00:07:34.399 RUH Usage Desc #007: RUH Attributes: Unused 00:07:34.399 00:07:34.399 FDP statistics log page 00:07:34.399 ======================= 00:07:34.399 Host bytes with metadata written: 567779328 00:07:34.399 Media bytes with metadata written: 567857152 00:07:34.399 Media bytes erased: 0 00:07:34.399 00:07:34.399 FDP events log page 00:07:34.399 =================== 00:07:34.399 Number of FDP events: 0 00:07:34.399 00:07:34.399 NVM Specific Namespace Data 00:07:34.399 =========================== 00:07:34.399 Logical Block Storage Tag Mask: 0 00:07:34.399 Protection Information Capabilities: 00:07:34.399 16b Guard Protection Information Storage Tag Support: No 00:07:34.399 16b Guard Protection Information Storage Tag Mask: Any bit in LBSTM can be 0 00:07:34.399 Storage Tag Check Read Support: No 00:07:34.399 Extended LBA Format #00: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.399 Extended LBA Format #01: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.399 Extended LBA Format #02: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.399 Extended LBA Format #03: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.399 Extended LBA Format #04: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.399 Extended LBA Format #05: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.399 Extended LBA Format #06: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.399 Extended LBA
Format #07: Storage Tag Size: 0 , Protection Information Format: 16b Guard PI 00:07:34.399 00:07:34.399 real 0m1.079s 00:07:34.399 user 0m0.377s 00:07:34.400 sys 0m0.509s 00:07:34.400 04:57:54 nvme.nvme_identify -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:34.400 04:57:54 nvme.nvme_identify -- common/autotest_common.sh@10 -- # set +x 00:07:34.400 ************************************ 00:07:34.400 END TEST nvme_identify 00:07:34.400 ************************************ 00:07:34.658 04:57:54 nvme -- nvme/nvme.sh@86 -- # run_test nvme_perf nvme_perf 00:07:34.658 04:57:54 nvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:34.658 04:57:54 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:34.658 04:57:54 nvme -- common/autotest_common.sh@10 -- # set +x 00:07:34.658 ************************************ 00:07:34.658 START TEST nvme_perf 00:07:34.658 ************************************ 00:07:34.658 04:57:54 nvme.nvme_perf -- common/autotest_common.sh@1129 -- # nvme_perf 00:07:34.658 04:57:54 nvme.nvme_perf -- nvme/nvme.sh@22 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_perf -q 128 -w read -o 12288 -t 1 -LL -i 0 -N 00:07:35.591 Initializing NVMe Controllers 00:07:35.591 Attached to NVMe Controller at 0000:00:10.0 [1b36:0010] 00:07:35.591 Attached to NVMe Controller at 0000:00:11.0 [1b36:0010] 00:07:35.591 Attached to NVMe Controller at 0000:00:13.0 [1b36:0010] 00:07:35.591 Attached to NVMe Controller at 0000:00:12.0 [1b36:0010] 00:07:35.591 Associating PCIE (0000:00:10.0) NSID 1 with lcore 0 00:07:35.591 Associating PCIE (0000:00:11.0) NSID 1 with lcore 0 00:07:35.591 Associating PCIE (0000:00:13.0) NSID 1 with lcore 0 00:07:35.591 Associating PCIE (0000:00:12.0) NSID 1 with lcore 0 00:07:35.591 Associating PCIE (0000:00:12.0) NSID 2 with lcore 0 00:07:35.591 Associating PCIE (0000:00:12.0) NSID 3 with lcore 0 00:07:35.591 Initialization complete. Launching workers. 
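The nvme_perf stage above exercises all of the attached controllers at once with SPDK's bundled perf tool. As a minimal sketch (not part of the job itself), the same run can be repeated by hand against an SPDK build tree; the flag glosses follow spdk_nvme_perf's help text, and -N is carried over from the logged invocation without interpretation:

  # Sketch only: repeat this stage's read workload manually.
  SPDK_DIR=/home/vagrant/spdk_repo/spdk    # build tree used by this job
  sudo "$SPDK_DIR/build/bin/spdk_nvme_perf" \
      -q 128   `# keep 128 I/Os outstanding per queue pair` \
      -w read  `# sequential read workload` \
      -o 12288 `# 12288-byte (12 KiB) I/Os, i.e. three 4 KiB blocks` \
      -t 1     `# run for 1 second` \
      -LL      `# latency tracking; doubling -L adds the detailed per-device histograms below` \
      -i 0 -N  `# shared-memory group id 0; -N kept verbatim from the logged command`

With no -r transport ID given, the tool attaches to every locally probed PCIe NVMe controller, which is consistent with the six namespace associations listed above.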
00:07:35.591 ======================================================== 00:07:35.591 Latency(us) 00:07:35.591 Device Information : IOPS MiB/s Average min max 00:07:35.591 PCIE (0000:00:10.0) NSID 1 from core 0: 17613.50 206.41 7267.34 4711.91 30867.50 00:07:35.591 PCIE (0000:00:11.0) NSID 1 from core 0: 17613.50 206.41 7261.52 4549.40 30583.10 00:07:35.591 PCIE (0000:00:13.0) NSID 1 from core 0: 17613.50 206.41 7254.49 4031.80 30593.71 00:07:35.591 PCIE (0000:00:12.0) NSID 1 from core 0: 17613.50 206.41 7247.33 3850.41 30250.54 00:07:35.591 PCIE (0000:00:12.0) NSID 2 from core 0: 17613.50 206.41 7239.96 3641.05 29463.48 00:07:35.591 PCIE (0000:00:12.0) NSID 3 from core 0: 17613.50 206.41 7232.18 3431.00 29012.09 00:07:35.591 ======================================================== 00:07:35.591 Total : 105681.01 1238.45 7250.47 3431.00 30867.50 00:07:35.591 00:07:35.591 Summary latency data for PCIE (0000:00:10.0) NSID 1 from core 0: 00:07:35.591 ================================================================================= 00:07:35.591 1.00000% : 6099.889us 00:07:35.591 10.00000% : 6326.745us 00:07:35.591 25.00000% : 6553.600us 00:07:35.591 50.00000% : 6856.074us 00:07:35.591 75.00000% : 7208.960us 00:07:35.591 90.00000% : 7763.495us 00:07:35.591 95.00000% : 10435.348us 00:07:35.591 98.00000% : 13107.200us 00:07:35.591 99.00000% : 14518.745us 00:07:35.591 99.50000% : 21273.994us 00:07:35.591 99.90000% : 30045.735us 00:07:35.591 99.99000% : 30852.332us 00:07:35.591 99.99900% : 31053.982us 00:07:35.591 99.99990% : 31053.982us 00:07:35.591 99.99999% : 31053.982us 00:07:35.591 00:07:35.591 Summary latency data for PCIE (0000:00:11.0) NSID 1 from core 0: 00:07:35.591 ================================================================================= 00:07:35.591 1.00000% : 6200.714us 00:07:35.591 10.00000% : 6402.363us 00:07:35.591 25.00000% : 6604.012us 00:07:35.591 50.00000% : 6856.074us 00:07:35.591 75.00000% : 7158.548us 00:07:35.591 90.00000% : 7763.495us 00:07:35.591 95.00000% : 10636.997us 00:07:35.591 98.00000% : 12502.252us 00:07:35.591 99.00000% : 15123.692us 00:07:35.591 99.50000% : 21374.818us 00:07:35.591 99.90000% : 30247.385us 00:07:35.591 99.99000% : 30650.683us 00:07:35.591 99.99900% : 30650.683us 00:07:35.591 99.99990% : 30650.683us 00:07:35.591 99.99999% : 30650.683us 00:07:35.591 00:07:35.591 Summary latency data for PCIE (0000:00:13.0) NSID 1 from core 0: 00:07:35.591 ================================================================================= 00:07:35.591 1.00000% : 6175.508us 00:07:35.591 10.00000% : 6377.157us 00:07:35.591 25.00000% : 6604.012us 00:07:35.591 50.00000% : 6856.074us 00:07:35.591 75.00000% : 7158.548us 00:07:35.591 90.00000% : 7813.908us 00:07:35.591 95.00000% : 10737.822us 00:07:35.591 98.00000% : 12754.314us 00:07:35.591 99.00000% : 15325.342us 00:07:35.591 99.50000% : 21374.818us 00:07:35.591 99.90000% : 30247.385us 00:07:35.591 99.99000% : 30650.683us 00:07:35.591 99.99900% : 30650.683us 00:07:35.591 99.99990% : 30650.683us 00:07:35.591 99.99999% : 30650.683us 00:07:35.591 00:07:35.591 Summary latency data for PCIE (0000:00:12.0) NSID 1 from core 0: 00:07:35.591 ================================================================================= 00:07:35.591 1.00000% : 6150.302us 00:07:35.591 10.00000% : 6402.363us 00:07:35.591 25.00000% : 6604.012us 00:07:35.591 50.00000% : 6856.074us 00:07:35.591 75.00000% : 7158.548us 00:07:35.591 90.00000% : 7662.671us 00:07:35.591 95.00000% : 10636.997us 00:07:35.591 98.00000% : 12905.551us 00:07:35.591 99.00000% 
: 15426.166us 00:07:35.591 99.50000% : 21374.818us 00:07:35.591 99.90000% : 30045.735us 00:07:35.591 99.99000% : 30247.385us 00:07:35.591 99.99900% : 30449.034us 00:07:35.591 99.99990% : 30449.034us 00:07:35.591 99.99999% : 30449.034us 00:07:35.591 00:07:35.591 Summary latency data for PCIE (0000:00:12.0) NSID 2 from core 0: 00:07:35.591 ================================================================================= 00:07:35.591 1.00000% : 6150.302us 00:07:35.591 10.00000% : 6377.157us 00:07:35.591 25.00000% : 6604.012us 00:07:35.591 50.00000% : 6856.074us 00:07:35.591 75.00000% : 7158.548us 00:07:35.591 90.00000% : 7662.671us 00:07:35.591 95.00000% : 10334.523us 00:07:35.591 98.00000% : 12905.551us 00:07:35.591 99.00000% : 14619.569us 00:07:35.591 99.50000% : 21374.818us 00:07:35.591 99.90000% : 29239.138us 00:07:35.591 99.99000% : 29642.437us 00:07:35.591 99.99900% : 29642.437us 00:07:35.591 99.99990% : 29642.437us 00:07:35.591 99.99999% : 29642.437us 00:07:35.591 00:07:35.591 Summary latency data for PCIE (0000:00:12.0) NSID 3 from core 0: 00:07:35.591 ================================================================================= 00:07:35.591 1.00000% : 6150.302us 00:07:35.591 10.00000% : 6402.363us 00:07:35.591 25.00000% : 6604.012us 00:07:35.591 50.00000% : 6856.074us 00:07:35.591 75.00000% : 7158.548us 00:07:35.591 90.00000% : 7662.671us 00:07:35.591 95.00000% : 10183.286us 00:07:35.591 98.00000% : 12855.138us 00:07:35.591 99.00000% : 14216.271us 00:07:35.591 99.50000% : 21273.994us 00:07:35.591 99.90000% : 28835.840us 00:07:35.591 99.99000% : 29037.489us 00:07:35.591 99.99900% : 29037.489us 00:07:35.591 99.99990% : 29037.489us 00:07:35.591 99.99999% : 29037.489us 00:07:35.591 00:07:35.591 Latency histogram for PCIE (0000:00:10.0) NSID 1 from core 0: 00:07:35.591 ============================================================================== 00:07:35.591 Range in us Cumulative IO count 00:07:35.591 4688.345 - 4713.551: 0.0057% ( 1) 00:07:35.591 4713.551 - 4738.757: 0.0226% ( 3) 00:07:35.591 4738.757 - 4763.963: 0.0340% ( 2) 00:07:35.591 4763.963 - 4789.169: 0.0453% ( 2) 00:07:35.591 4814.375 - 4839.582: 0.0679% ( 4) 00:07:35.591 4864.788 - 4889.994: 0.0849% ( 3) 00:07:35.591 4889.994 - 4915.200: 0.0962% ( 2) 00:07:35.591 4915.200 - 4940.406: 0.1076% ( 2) 00:07:35.591 4940.406 - 4965.612: 0.1132% ( 1) 00:07:35.591 4965.612 - 4990.818: 0.1302% ( 3) 00:07:35.591 4990.818 - 5016.025: 0.1359% ( 1) 00:07:35.591 5016.025 - 5041.231: 0.1472% ( 2) 00:07:35.591 5041.231 - 5066.437: 0.1585% ( 2) 00:07:35.591 5066.437 - 5091.643: 0.1698% ( 2) 00:07:35.591 5091.643 - 5116.849: 0.1812% ( 2) 00:07:35.591 5116.849 - 5142.055: 0.1925% ( 2) 00:07:35.591 5142.055 - 5167.262: 0.2038% ( 2) 00:07:35.591 5167.262 - 5192.468: 0.2151% ( 2) 00:07:35.591 5192.468 - 5217.674: 0.2208% ( 1) 00:07:35.591 5217.674 - 5242.880: 0.2321% ( 2) 00:07:35.591 5242.880 - 5268.086: 0.2491% ( 3) 00:07:35.591 5268.086 - 5293.292: 0.2548% ( 1) 00:07:35.591 5293.292 - 5318.498: 0.2661% ( 2) 00:07:35.591 5318.498 - 5343.705: 0.2774% ( 2) 00:07:35.591 5343.705 - 5368.911: 0.2831% ( 1) 00:07:35.591 5368.911 - 5394.117: 0.2944% ( 2) 00:07:35.591 5394.117 - 5419.323: 0.3057% ( 2) 00:07:35.591 5419.323 - 5444.529: 0.3170% ( 2) 00:07:35.591 5444.529 - 5469.735: 0.3284% ( 2) 00:07:35.591 5469.735 - 5494.942: 0.3340% ( 1) 00:07:35.591 5494.942 - 5520.148: 0.3567% ( 4) 00:07:35.591 5520.148 - 5545.354: 0.3623% ( 1) 00:07:35.591 5973.858 - 5999.065: 0.3906% ( 5) 00:07:35.591 5999.065 - 6024.271: 0.4529% ( 11) 00:07:35.591 6024.271 - 
6049.477: 0.6058% ( 27) 00:07:35.591 6049.477 - 6074.683: 0.7982% ( 34) 00:07:35.591 6074.683 - 6099.889: 1.0360% ( 42) 00:07:35.591 6099.889 - 6125.095: 1.5285% ( 87) 00:07:35.591 6125.095 - 6150.302: 2.0890% ( 99) 00:07:35.591 6150.302 - 6175.508: 2.8929% ( 142) 00:07:35.591 6175.508 - 6200.714: 3.7704% ( 155) 00:07:35.591 6200.714 - 6225.920: 4.6932% ( 163) 00:07:35.591 6225.920 - 6251.126: 5.8990% ( 213) 00:07:35.591 6251.126 - 6276.332: 7.2634% ( 241) 00:07:35.591 6276.332 - 6301.538: 8.8768% ( 285) 00:07:35.591 6301.538 - 6326.745: 10.6488% ( 313) 00:07:35.591 6326.745 - 6351.951: 12.4264% ( 314) 00:07:35.591 6351.951 - 6377.157: 14.1701% ( 308) 00:07:35.591 6377.157 - 6402.363: 15.8062% ( 289) 00:07:35.591 6402.363 - 6427.569: 17.6744% ( 330) 00:07:35.591 6427.569 - 6452.775: 19.3558% ( 297) 00:07:35.591 6452.775 - 6503.188: 22.9167% ( 629) 00:07:35.591 6503.188 - 6553.600: 26.6927% ( 667) 00:07:35.591 6553.600 - 6604.012: 30.5084% ( 674) 00:07:35.591 6604.012 - 6654.425: 34.4769% ( 701) 00:07:35.591 6654.425 - 6704.837: 38.3039% ( 676) 00:07:35.591 6704.837 - 6755.249: 42.2837% ( 703) 00:07:35.591 6755.249 - 6805.662: 46.2240% ( 696) 00:07:35.591 6805.662 - 6856.074: 50.3057% ( 721) 00:07:35.591 6856.074 - 6906.486: 54.2233% ( 692) 00:07:35.591 6906.486 - 6956.898: 58.3163% ( 723) 00:07:35.591 6956.898 - 7007.311: 62.3132% ( 706) 00:07:35.591 7007.311 - 7057.723: 66.3779% ( 718) 00:07:35.591 7057.723 - 7108.135: 70.4314% ( 716) 00:07:35.591 7108.135 - 7158.548: 74.2810% ( 680) 00:07:35.591 7158.548 - 7208.960: 77.5815% ( 583) 00:07:35.591 7208.960 - 7259.372: 80.5197% ( 519) 00:07:35.591 7259.372 - 7309.785: 83.0276% ( 443) 00:07:35.591 7309.785 - 7360.197: 84.9921% ( 347) 00:07:35.591 7360.197 - 7410.609: 86.3338% ( 237) 00:07:35.591 7410.609 - 7461.022: 87.2962% ( 170) 00:07:35.591 7461.022 - 7511.434: 87.9529% ( 116) 00:07:35.591 7511.434 - 7561.846: 88.5926% ( 113) 00:07:35.591 7561.846 - 7612.258: 89.1191% ( 93) 00:07:35.591 7612.258 - 7662.671: 89.4644% ( 61) 00:07:35.591 7662.671 - 7713.083: 89.7871% ( 57) 00:07:35.591 7713.083 - 7763.495: 90.1042% ( 56) 00:07:35.591 7763.495 - 7813.908: 90.3250% ( 39) 00:07:35.592 7813.908 - 7864.320: 90.5231% ( 35) 00:07:35.592 7864.320 - 7914.732: 90.6929% ( 30) 00:07:35.592 7914.732 - 7965.145: 90.8684% ( 31) 00:07:35.592 7965.145 - 8015.557: 90.9647% ( 17) 00:07:35.592 8015.557 - 8065.969: 91.0609% ( 17) 00:07:35.592 8065.969 - 8116.382: 91.1458% ( 15) 00:07:35.592 8116.382 - 8166.794: 91.2194% ( 13) 00:07:35.592 8166.794 - 8217.206: 91.2987% ( 14) 00:07:35.592 8217.206 - 8267.618: 91.3949% ( 17) 00:07:35.592 8267.618 - 8318.031: 91.4515% ( 10) 00:07:35.592 8318.031 - 8368.443: 91.4798% ( 5) 00:07:35.592 8368.443 - 8418.855: 91.5025% ( 4) 00:07:35.592 8418.855 - 8469.268: 91.5251% ( 4) 00:07:35.592 8469.268 - 8519.680: 91.5591% ( 6) 00:07:35.592 8519.680 - 8570.092: 91.6157% ( 10) 00:07:35.592 8570.092 - 8620.505: 91.6950% ( 14) 00:07:35.592 8620.505 - 8670.917: 91.7686% ( 13) 00:07:35.592 8670.917 - 8721.329: 91.8535% ( 15) 00:07:35.592 8721.329 - 8771.742: 91.9214% ( 12) 00:07:35.592 8771.742 - 8822.154: 92.0346% ( 20) 00:07:35.592 8822.154 - 8872.566: 92.0743% ( 7) 00:07:35.592 8872.566 - 8922.978: 92.1932% ( 21) 00:07:35.592 8922.978 - 8973.391: 92.3007% ( 19) 00:07:35.592 8973.391 - 9023.803: 92.3970% ( 17) 00:07:35.592 9023.803 - 9074.215: 92.5102% ( 20) 00:07:35.592 9074.215 - 9124.628: 92.6291% ( 21) 00:07:35.592 9124.628 - 9175.040: 92.7480% ( 21) 00:07:35.592 9175.040 - 9225.452: 92.8499% ( 18) 00:07:35.592 9225.452 - 
9275.865: 92.9574% ( 19) 00:07:35.592 9275.865 - 9326.277: 93.0933% ( 24) 00:07:35.592 9326.277 - 9376.689: 93.2235% ( 23) 00:07:35.592 9376.689 - 9427.102: 93.3707% ( 26) 00:07:35.592 9427.102 - 9477.514: 93.4669% ( 17) 00:07:35.592 9477.514 - 9527.926: 93.5745% ( 19) 00:07:35.592 9527.926 - 9578.338: 93.7047% ( 23) 00:07:35.592 9578.338 - 9628.751: 93.8293% ( 22) 00:07:35.592 9628.751 - 9679.163: 93.9312% ( 18) 00:07:35.592 9679.163 - 9729.575: 94.0500% ( 21) 00:07:35.592 9729.575 - 9779.988: 94.1350% ( 15) 00:07:35.592 9779.988 - 9830.400: 94.1972% ( 11) 00:07:35.592 9830.400 - 9880.812: 94.2822% ( 15) 00:07:35.592 9880.812 - 9931.225: 94.3784% ( 17) 00:07:35.592 9931.225 - 9981.637: 94.4577% ( 14) 00:07:35.592 9981.637 - 10032.049: 94.5143% ( 10) 00:07:35.592 10032.049 - 10082.462: 94.5935% ( 14) 00:07:35.592 10082.462 - 10132.874: 94.6445% ( 9) 00:07:35.592 10132.874 - 10183.286: 94.7181% ( 13) 00:07:35.592 10183.286 - 10233.698: 94.7577% ( 7) 00:07:35.592 10233.698 - 10284.111: 94.8200% ( 11) 00:07:35.592 10284.111 - 10334.523: 94.8653% ( 8) 00:07:35.592 10334.523 - 10384.935: 94.9558% ( 16) 00:07:35.592 10384.935 - 10435.348: 95.0804% ( 22) 00:07:35.592 10435.348 - 10485.760: 95.1596% ( 14) 00:07:35.592 10485.760 - 10536.172: 95.2502% ( 16) 00:07:35.592 10536.172 - 10586.585: 95.3408% ( 16) 00:07:35.592 10586.585 - 10636.997: 95.4144% ( 13) 00:07:35.592 10636.997 - 10687.409: 95.4654% ( 9) 00:07:35.592 10687.409 - 10737.822: 95.5389% ( 13) 00:07:35.592 10737.822 - 10788.234: 95.6012% ( 11) 00:07:35.592 10788.234 - 10838.646: 95.6692% ( 12) 00:07:35.592 10838.646 - 10889.058: 95.7201% ( 9) 00:07:35.592 10889.058 - 10939.471: 95.7597% ( 7) 00:07:35.592 10939.471 - 10989.883: 95.8163% ( 10) 00:07:35.592 10989.883 - 11040.295: 95.8673% ( 9) 00:07:35.592 11040.295 - 11090.708: 95.9183% ( 9) 00:07:35.592 11090.708 - 11141.120: 95.9579% ( 7) 00:07:35.592 11141.120 - 11191.532: 95.9975% ( 7) 00:07:35.592 11191.532 - 11241.945: 96.0654% ( 12) 00:07:35.592 11241.945 - 11292.357: 96.1107% ( 8) 00:07:35.592 11292.357 - 11342.769: 96.1673% ( 10) 00:07:35.592 11342.769 - 11393.182: 96.2296% ( 11) 00:07:35.592 11393.182 - 11443.594: 96.2919% ( 11) 00:07:35.592 11443.594 - 11494.006: 96.3542% ( 11) 00:07:35.592 11494.006 - 11544.418: 96.3995% ( 8) 00:07:35.592 11544.418 - 11594.831: 96.4504% ( 9) 00:07:35.592 11594.831 - 11645.243: 96.5014% ( 9) 00:07:35.592 11645.243 - 11695.655: 96.5580% ( 10) 00:07:35.592 11695.655 - 11746.068: 96.6033% ( 8) 00:07:35.592 11746.068 - 11796.480: 96.6655% ( 11) 00:07:35.592 11796.480 - 11846.892: 96.7165% ( 9) 00:07:35.592 11846.892 - 11897.305: 96.7618% ( 8) 00:07:35.592 11897.305 - 11947.717: 96.8354% ( 13) 00:07:35.592 11947.717 - 11998.129: 96.8976% ( 11) 00:07:35.592 11998.129 - 12048.542: 96.9769% ( 14) 00:07:35.592 12048.542 - 12098.954: 97.0448% ( 12) 00:07:35.592 12098.954 - 12149.366: 97.1014% ( 10) 00:07:35.592 12149.366 - 12199.778: 97.1750% ( 13) 00:07:35.592 12199.778 - 12250.191: 97.2543% ( 14) 00:07:35.592 12250.191 - 12300.603: 97.3222% ( 12) 00:07:35.592 12300.603 - 12351.015: 97.4015% ( 14) 00:07:35.592 12351.015 - 12401.428: 97.4751% ( 13) 00:07:35.592 12401.428 - 12451.840: 97.5600% ( 15) 00:07:35.592 12451.840 - 12502.252: 97.6110% ( 9) 00:07:35.592 12502.252 - 12552.665: 97.6562% ( 8) 00:07:35.592 12552.665 - 12603.077: 97.6902% ( 6) 00:07:35.592 12603.077 - 12653.489: 97.7412% ( 9) 00:07:35.592 12653.489 - 12703.902: 97.7695% ( 5) 00:07:35.592 12703.902 - 12754.314: 97.8148% ( 8) 00:07:35.592 12754.314 - 12804.726: 97.8374% ( 4) 00:07:35.592 
12804.726 - 12855.138: 97.8657% ( 5) 00:07:35.592 12855.138 - 12905.551: 97.9110% ( 8) 00:07:35.592 12905.551 - 13006.375: 97.9789% ( 12) 00:07:35.592 13006.375 - 13107.200: 98.0299% ( 9) 00:07:35.592 13107.200 - 13208.025: 98.0865% ( 10) 00:07:35.592 13208.025 - 13308.849: 98.1488% ( 11) 00:07:35.592 13308.849 - 13409.674: 98.2167% ( 12) 00:07:35.592 13409.674 - 13510.498: 98.2677% ( 9) 00:07:35.592 13510.498 - 13611.323: 98.3639% ( 17) 00:07:35.592 13611.323 - 13712.148: 98.4601% ( 17) 00:07:35.592 13712.148 - 13812.972: 98.5337% ( 13) 00:07:35.592 13812.972 - 13913.797: 98.5960% ( 11) 00:07:35.592 13913.797 - 14014.622: 98.6809% ( 15) 00:07:35.592 14014.622 - 14115.446: 98.7715% ( 16) 00:07:35.592 14115.446 - 14216.271: 98.8394% ( 12) 00:07:35.592 14216.271 - 14317.095: 98.9130% ( 13) 00:07:35.592 14317.095 - 14417.920: 98.9923% ( 14) 00:07:35.592 14417.920 - 14518.745: 99.0263% ( 6) 00:07:35.592 14518.745 - 14619.569: 99.0602% ( 6) 00:07:35.592 14619.569 - 14720.394: 99.0942% ( 6) 00:07:35.592 14720.394 - 14821.218: 99.1225% ( 5) 00:07:35.592 14821.218 - 14922.043: 99.1508% ( 5) 00:07:35.592 14922.043 - 15022.868: 99.1735% ( 4) 00:07:35.592 15022.868 - 15123.692: 99.1904% ( 3) 00:07:35.592 15123.692 - 15224.517: 99.2074% ( 3) 00:07:35.592 15224.517 - 15325.342: 99.2244% ( 3) 00:07:35.592 15325.342 - 15426.166: 99.2414% ( 3) 00:07:35.592 15426.166 - 15526.991: 99.2640% ( 4) 00:07:35.592 15526.991 - 15627.815: 99.2754% ( 2) 00:07:35.592 19761.625 - 19862.449: 99.2810% ( 1) 00:07:35.592 19862.449 - 19963.274: 99.2980% ( 3) 00:07:35.592 19963.274 - 20064.098: 99.3093% ( 2) 00:07:35.592 20064.098 - 20164.923: 99.3320% ( 4) 00:07:35.592 20164.923 - 20265.748: 99.3490% ( 3) 00:07:35.592 20265.748 - 20366.572: 99.3603% ( 2) 00:07:35.592 20366.572 - 20467.397: 99.3829% ( 4) 00:07:35.592 20467.397 - 20568.222: 99.3942% ( 2) 00:07:35.592 20568.222 - 20669.046: 99.4112% ( 3) 00:07:35.592 20669.046 - 20769.871: 99.4282% ( 3) 00:07:35.592 20769.871 - 20870.695: 99.4452% ( 3) 00:07:35.592 20870.695 - 20971.520: 99.4622% ( 3) 00:07:35.592 20971.520 - 21072.345: 99.4792% ( 3) 00:07:35.592 21072.345 - 21173.169: 99.4962% ( 3) 00:07:35.592 21173.169 - 21273.994: 99.5188% ( 4) 00:07:35.592 21273.994 - 21374.818: 99.5301% ( 2) 00:07:35.592 21374.818 - 21475.643: 99.5471% ( 3) 00:07:35.592 21475.643 - 21576.468: 99.5641% ( 3) 00:07:35.592 21576.468 - 21677.292: 99.5811% ( 3) 00:07:35.592 21677.292 - 21778.117: 99.5981% ( 3) 00:07:35.592 21778.117 - 21878.942: 99.6150% ( 3) 00:07:35.592 21878.942 - 21979.766: 99.6377% ( 4) 00:07:35.592 28432.542 - 28634.191: 99.6660% ( 5) 00:07:35.592 28634.191 - 28835.840: 99.7000% ( 6) 00:07:35.592 28835.840 - 29037.489: 99.7339% ( 6) 00:07:35.592 29037.489 - 29239.138: 99.7679% ( 6) 00:07:35.592 29239.138 - 29440.788: 99.8019% ( 6) 00:07:35.592 29440.788 - 29642.437: 99.8358% ( 6) 00:07:35.592 29642.437 - 29844.086: 99.8755% ( 7) 00:07:35.592 29844.086 - 30045.735: 99.9094% ( 6) 00:07:35.592 30045.735 - 30247.385: 99.9434% ( 6) 00:07:35.592 30247.385 - 30449.034: 99.9604% ( 3) 00:07:35.592 30650.683 - 30852.332: 99.9943% ( 6) 00:07:35.592 30852.332 - 31053.982: 100.0000% ( 1) 00:07:35.592 00:07:35.592 Latency histogram for PCIE (0000:00:11.0) NSID 1 from core 0: 00:07:35.853 ============================================================================== 00:07:35.853 Range in us Cumulative IO count 00:07:35.853 4537.108 - 4562.314: 0.0226% ( 4) 00:07:35.853 4562.314 - 4587.520: 0.0566% ( 6) 00:07:35.853 4587.520 - 4612.726: 0.0679% ( 2) 00:07:35.853 4637.932 - 4663.138: 
0.0793% ( 2) 00:07:35.853 4663.138 - 4688.345: 0.0906% ( 2) 00:07:35.853 4688.345 - 4713.551: 0.1019% ( 2) 00:07:35.853 4713.551 - 4738.757: 0.1132% ( 2) 00:07:35.853 4738.757 - 4763.963: 0.1189% ( 1) 00:07:35.853 4763.963 - 4789.169: 0.1245% ( 1) 00:07:35.853 4789.169 - 4814.375: 0.1359% ( 2) 00:07:35.853 4814.375 - 4839.582: 0.1529% ( 3) 00:07:35.853 4839.582 - 4864.788: 0.1642% ( 2) 00:07:35.853 4864.788 - 4889.994: 0.1755% ( 2) 00:07:35.853 4889.994 - 4915.200: 0.1868% ( 2) 00:07:35.853 4915.200 - 4940.406: 0.1981% ( 2) 00:07:35.853 4940.406 - 4965.612: 0.2095% ( 2) 00:07:35.853 4965.612 - 4990.818: 0.2208% ( 2) 00:07:35.853 4990.818 - 5016.025: 0.2378% ( 3) 00:07:35.853 5016.025 - 5041.231: 0.2491% ( 2) 00:07:35.853 5041.231 - 5066.437: 0.2604% ( 2) 00:07:35.853 5066.437 - 5091.643: 0.2717% ( 2) 00:07:35.853 5091.643 - 5116.849: 0.2887% ( 3) 00:07:35.853 5116.849 - 5142.055: 0.3000% ( 2) 00:07:35.853 5142.055 - 5167.262: 0.3114% ( 2) 00:07:35.853 5167.262 - 5192.468: 0.3227% ( 2) 00:07:35.853 5192.468 - 5217.674: 0.3340% ( 2) 00:07:35.853 5217.674 - 5242.880: 0.3510% ( 3) 00:07:35.853 5242.880 - 5268.086: 0.3567% ( 1) 00:07:35.853 5268.086 - 5293.292: 0.3623% ( 1) 00:07:35.853 6049.477 - 6074.683: 0.3680% ( 1) 00:07:35.853 6074.683 - 6099.889: 0.4189% ( 9) 00:07:35.853 6099.889 - 6125.095: 0.4925% ( 13) 00:07:35.853 6125.095 - 6150.302: 0.6397% ( 26) 00:07:35.853 6150.302 - 6175.508: 0.9737% ( 59) 00:07:35.853 6175.508 - 6200.714: 1.3191% ( 61) 00:07:35.853 6200.714 - 6225.920: 1.8739% ( 98) 00:07:35.853 6225.920 - 6251.126: 2.7797% ( 160) 00:07:35.853 6251.126 - 6276.332: 3.9346% ( 204) 00:07:35.853 6276.332 - 6301.538: 5.1404% ( 213) 00:07:35.853 6301.538 - 6326.745: 6.6067% ( 259) 00:07:35.853 6326.745 - 6351.951: 8.0389% ( 253) 00:07:35.853 6351.951 - 6377.157: 9.7430% ( 301) 00:07:35.853 6377.157 - 6402.363: 11.4866% ( 308) 00:07:35.853 6402.363 - 6427.569: 13.5983% ( 373) 00:07:35.853 6427.569 - 6452.775: 15.7495% ( 380) 00:07:35.853 6452.775 - 6503.188: 20.2332% ( 792) 00:07:35.853 6503.188 - 6553.600: 24.3716% ( 731) 00:07:35.853 6553.600 - 6604.012: 28.6798% ( 761) 00:07:35.853 6604.012 - 6654.425: 32.9257% ( 750) 00:07:35.853 6654.425 - 6704.837: 37.4660% ( 802) 00:07:35.853 6704.837 - 6755.249: 41.8139% ( 768) 00:07:35.853 6755.249 - 6805.662: 46.5297% ( 833) 00:07:35.853 6805.662 - 6856.074: 51.2455% ( 833) 00:07:35.853 6856.074 - 6906.486: 55.9669% ( 834) 00:07:35.853 6906.486 - 6956.898: 60.6091% ( 820) 00:07:35.853 6956.898 - 7007.311: 65.2287% ( 816) 00:07:35.853 7007.311 - 7057.723: 69.6445% ( 780) 00:07:35.853 7057.723 - 7108.135: 73.6243% ( 703) 00:07:35.853 7108.135 - 7158.548: 77.3154% ( 652) 00:07:35.853 7158.548 - 7208.960: 80.4688% ( 557) 00:07:35.853 7208.960 - 7259.372: 82.9257% ( 434) 00:07:35.853 7259.372 - 7309.785: 84.7203% ( 317) 00:07:35.853 7309.785 - 7360.197: 85.9658% ( 220) 00:07:35.853 7360.197 - 7410.609: 86.8603% ( 158) 00:07:35.853 7410.609 - 7461.022: 87.6076% ( 132) 00:07:35.853 7461.022 - 7511.434: 88.2529% ( 114) 00:07:35.853 7511.434 - 7561.846: 88.7625% ( 90) 00:07:35.853 7561.846 - 7612.258: 89.1870% ( 75) 00:07:35.853 7612.258 - 7662.671: 89.5607% ( 66) 00:07:35.853 7662.671 - 7713.083: 89.8890% ( 58) 00:07:35.854 7713.083 - 7763.495: 90.1268% ( 42) 00:07:35.854 7763.495 - 7813.908: 90.3533% ( 40) 00:07:35.854 7813.908 - 7864.320: 90.5288% ( 31) 00:07:35.854 7864.320 - 7914.732: 90.7156% ( 33) 00:07:35.854 7914.732 - 7965.145: 90.8854% ( 30) 00:07:35.854 7965.145 - 8015.557: 91.0836% ( 35) 00:07:35.854 8015.557 - 8065.969: 91.2308% ( 26) 
00:07:35.854 8065.969 - 8116.382: 91.3723% ( 25) 00:07:35.854 8116.382 - 8166.794: 91.4798% ( 19) 00:07:35.854 8166.794 - 8217.206: 91.5931% ( 20) 00:07:35.854 8217.206 - 8267.618: 91.6893% ( 17) 00:07:35.854 8267.618 - 8318.031: 91.7572% ( 12) 00:07:35.854 8318.031 - 8368.443: 91.8308% ( 13) 00:07:35.854 8368.443 - 8418.855: 91.8875% ( 10) 00:07:35.854 8418.855 - 8469.268: 91.9611% ( 13) 00:07:35.854 8469.268 - 8519.680: 92.0460% ( 15) 00:07:35.854 8519.680 - 8570.092: 92.1365% ( 16) 00:07:35.854 8570.092 - 8620.505: 92.2045% ( 12) 00:07:35.854 8620.505 - 8670.917: 92.2951% ( 16) 00:07:35.854 8670.917 - 8721.329: 92.3800% ( 15) 00:07:35.854 8721.329 - 8771.742: 92.4706% ( 16) 00:07:35.854 8771.742 - 8822.154: 92.5894% ( 21) 00:07:35.854 8822.154 - 8872.566: 92.7027% ( 20) 00:07:35.854 8872.566 - 8922.978: 92.8046% ( 18) 00:07:35.854 8922.978 - 8973.391: 92.8725% ( 12) 00:07:35.854 8973.391 - 9023.803: 92.9518% ( 14) 00:07:35.854 9023.803 - 9074.215: 93.0310% ( 14) 00:07:35.854 9074.215 - 9124.628: 93.1273% ( 17) 00:07:35.854 9124.628 - 9175.040: 93.2235% ( 17) 00:07:35.854 9175.040 - 9225.452: 93.3084% ( 15) 00:07:35.854 9225.452 - 9275.865: 93.3877% ( 14) 00:07:35.854 9275.865 - 9326.277: 93.4783% ( 16) 00:07:35.854 9326.277 - 9376.689: 93.5462% ( 12) 00:07:35.854 9376.689 - 9427.102: 93.6311% ( 15) 00:07:35.854 9427.102 - 9477.514: 93.6990% ( 12) 00:07:35.854 9477.514 - 9527.926: 93.7783% ( 14) 00:07:35.854 9527.926 - 9578.338: 93.8293% ( 9) 00:07:35.854 9578.338 - 9628.751: 93.8915% ( 11) 00:07:35.854 9628.751 - 9679.163: 93.9595% ( 12) 00:07:35.854 9679.163 - 9729.575: 94.0500% ( 16) 00:07:35.854 9729.575 - 9779.988: 94.1236% ( 13) 00:07:35.854 9779.988 - 9830.400: 94.1803% ( 10) 00:07:35.854 9830.400 - 9880.812: 94.2538% ( 13) 00:07:35.854 9880.812 - 9931.225: 94.3218% ( 12) 00:07:35.854 9931.225 - 9981.637: 94.3954% ( 13) 00:07:35.854 9981.637 - 10032.049: 94.4577% ( 11) 00:07:35.854 10032.049 - 10082.462: 94.4973% ( 7) 00:07:35.854 10082.462 - 10132.874: 94.5369% ( 7) 00:07:35.854 10132.874 - 10183.286: 94.5765% ( 7) 00:07:35.854 10183.286 - 10233.698: 94.6275% ( 9) 00:07:35.854 10233.698 - 10284.111: 94.6841% ( 10) 00:07:35.854 10284.111 - 10334.523: 94.7407% ( 10) 00:07:35.854 10334.523 - 10384.935: 94.7917% ( 9) 00:07:35.854 10384.935 - 10435.348: 94.8370% ( 8) 00:07:35.854 10435.348 - 10485.760: 94.8879% ( 9) 00:07:35.854 10485.760 - 10536.172: 94.9389% ( 9) 00:07:35.854 10536.172 - 10586.585: 94.9841% ( 8) 00:07:35.854 10586.585 - 10636.997: 95.0351% ( 9) 00:07:35.854 10636.997 - 10687.409: 95.0804% ( 8) 00:07:35.854 10687.409 - 10737.822: 95.1370% ( 10) 00:07:35.854 10737.822 - 10788.234: 95.1823% ( 8) 00:07:35.854 10788.234 - 10838.646: 95.2332% ( 9) 00:07:35.854 10838.646 - 10889.058: 95.2672% ( 6) 00:07:35.854 10889.058 - 10939.471: 95.3408% ( 13) 00:07:35.854 10939.471 - 10989.883: 95.4484% ( 19) 00:07:35.854 10989.883 - 11040.295: 95.5163% ( 12) 00:07:35.854 11040.295 - 11090.708: 95.6069% ( 16) 00:07:35.854 11090.708 - 11141.120: 95.6861% ( 14) 00:07:35.854 11141.120 - 11191.532: 95.7767% ( 16) 00:07:35.854 11191.532 - 11241.945: 95.8730% ( 17) 00:07:35.854 11241.945 - 11292.357: 95.9579% ( 15) 00:07:35.854 11292.357 - 11342.769: 96.0485% ( 16) 00:07:35.854 11342.769 - 11393.182: 96.1221% ( 13) 00:07:35.854 11393.182 - 11443.594: 96.2070% ( 15) 00:07:35.854 11443.594 - 11494.006: 96.2919% ( 15) 00:07:35.854 11494.006 - 11544.418: 96.3768% ( 15) 00:07:35.854 11544.418 - 11594.831: 96.4731% ( 17) 00:07:35.854 11594.831 - 11645.243: 96.5693% ( 17) 00:07:35.854 11645.243 - 
11695.655: 96.6938% ( 22) 00:07:35.854 11695.655 - 11746.068: 96.8014% ( 19) 00:07:35.854 11746.068 - 11796.480: 96.9260% ( 22) 00:07:35.854 11796.480 - 11846.892: 97.0392% ( 20) 00:07:35.854 11846.892 - 11897.305: 97.1354% ( 17) 00:07:35.854 11897.305 - 11947.717: 97.2147% ( 14) 00:07:35.854 11947.717 - 11998.129: 97.3109% ( 17) 00:07:35.854 11998.129 - 12048.542: 97.4128% ( 18) 00:07:35.854 12048.542 - 12098.954: 97.4751% ( 11) 00:07:35.854 12098.954 - 12149.366: 97.5487% ( 13) 00:07:35.854 12149.366 - 12199.778: 97.6223% ( 13) 00:07:35.854 12199.778 - 12250.191: 97.6732% ( 9) 00:07:35.854 12250.191 - 12300.603: 97.7298% ( 10) 00:07:35.854 12300.603 - 12351.015: 97.8261% ( 17) 00:07:35.854 12351.015 - 12401.428: 97.9053% ( 14) 00:07:35.854 12401.428 - 12451.840: 97.9676% ( 11) 00:07:35.854 12451.840 - 12502.252: 98.0242% ( 10) 00:07:35.854 12502.252 - 12552.665: 98.0865% ( 11) 00:07:35.854 12552.665 - 12603.077: 98.1318% ( 8) 00:07:35.854 12603.077 - 12653.489: 98.1771% ( 8) 00:07:35.854 12653.489 - 12703.902: 98.2111% ( 6) 00:07:35.854 12703.902 - 12754.314: 98.2507% ( 7) 00:07:35.854 12754.314 - 12804.726: 98.2620% ( 2) 00:07:35.854 12804.726 - 12855.138: 98.2733% ( 2) 00:07:35.854 12855.138 - 12905.551: 98.2960% ( 4) 00:07:35.854 12905.551 - 13006.375: 98.3469% ( 9) 00:07:35.854 13006.375 - 13107.200: 98.3979% ( 9) 00:07:35.854 13107.200 - 13208.025: 98.4828% ( 15) 00:07:35.854 13208.025 - 13308.849: 98.5394% ( 10) 00:07:35.854 13308.849 - 13409.674: 98.5960% ( 10) 00:07:35.854 13409.674 - 13510.498: 98.6526% ( 10) 00:07:35.854 13510.498 - 13611.323: 98.7036% ( 9) 00:07:35.854 13611.323 - 13712.148: 98.7545% ( 9) 00:07:35.854 13712.148 - 13812.972: 98.8111% ( 10) 00:07:35.854 13812.972 - 13913.797: 98.8508% ( 7) 00:07:35.854 13913.797 - 14014.622: 98.8847% ( 6) 00:07:35.854 14014.622 - 14115.446: 98.9130% ( 5) 00:07:35.854 14619.569 - 14720.394: 98.9300% ( 3) 00:07:35.854 14720.394 - 14821.218: 98.9527% ( 4) 00:07:35.854 14821.218 - 14922.043: 98.9697% ( 3) 00:07:35.854 14922.043 - 15022.868: 98.9866% ( 3) 00:07:35.854 15022.868 - 15123.692: 99.0093% ( 4) 00:07:35.854 15123.692 - 15224.517: 99.0263% ( 3) 00:07:35.854 15224.517 - 15325.342: 99.0546% ( 5) 00:07:35.854 15325.342 - 15426.166: 99.0716% ( 3) 00:07:35.854 15426.166 - 15526.991: 99.0885% ( 3) 00:07:35.854 15526.991 - 15627.815: 99.0999% ( 2) 00:07:35.854 15627.815 - 15728.640: 99.1168% ( 3) 00:07:35.854 15728.640 - 15829.465: 99.1338% ( 3) 00:07:35.854 15829.465 - 15930.289: 99.1565% ( 4) 00:07:35.854 15930.289 - 16031.114: 99.1791% ( 4) 00:07:35.854 16031.114 - 16131.938: 99.1961% ( 3) 00:07:35.854 16131.938 - 16232.763: 99.2188% ( 4) 00:07:35.854 16232.763 - 16333.588: 99.2584% ( 7) 00:07:35.854 16333.588 - 16434.412: 99.2754% ( 3) 00:07:35.854 20164.923 - 20265.748: 99.2923% ( 3) 00:07:35.854 20265.748 - 20366.572: 99.3093% ( 3) 00:07:35.854 20366.572 - 20467.397: 99.3320% ( 4) 00:07:35.854 20467.397 - 20568.222: 99.3490% ( 3) 00:07:35.854 20568.222 - 20669.046: 99.3716% ( 4) 00:07:35.854 20669.046 - 20769.871: 99.3886% ( 3) 00:07:35.854 20769.871 - 20870.695: 99.4112% ( 4) 00:07:35.854 20870.695 - 20971.520: 99.4282% ( 3) 00:07:35.854 20971.520 - 21072.345: 99.4509% ( 4) 00:07:35.854 21072.345 - 21173.169: 99.4735% ( 4) 00:07:35.854 21173.169 - 21273.994: 99.4905% ( 3) 00:07:35.854 21273.994 - 21374.818: 99.5075% ( 3) 00:07:35.854 21374.818 - 21475.643: 99.5301% ( 4) 00:07:35.854 21475.643 - 21576.468: 99.5471% ( 3) 00:07:35.854 21576.468 - 21677.292: 99.5697% ( 4) 00:07:35.854 21677.292 - 21778.117: 99.5924% ( 4) 
00:07:35.854 21778.117 - 21878.942: 99.6094% ( 3) 00:07:35.854 21878.942 - 21979.766: 99.6264% ( 3) 00:07:35.854 21979.766 - 22080.591: 99.6377% ( 2) 00:07:35.854 28634.191 - 28835.840: 99.6603% ( 4) 00:07:35.854 28835.840 - 29037.489: 99.6943% ( 6) 00:07:35.854 29037.489 - 29239.138: 99.7339% ( 7) 00:07:35.854 29239.138 - 29440.788: 99.7736% ( 7) 00:07:35.854 29440.788 - 29642.437: 99.8132% ( 7) 00:07:35.854 29642.437 - 29844.086: 99.8528% ( 7) 00:07:35.854 29844.086 - 30045.735: 99.8981% ( 8) 00:07:35.854 30045.735 - 30247.385: 99.9377% ( 7) 00:07:35.854 30247.385 - 30449.034: 99.9774% ( 7) 00:07:35.854 30449.034 - 30650.683: 100.0000% ( 4) 00:07:35.854 00:07:35.854 Latency histogram for PCIE (0000:00:13.0) NSID 1 from core 0: 00:07:35.854 ============================================================================== 00:07:35.854 Range in us Cumulative IO count 00:07:35.854 4007.778 - 4032.985: 0.0057% ( 1) 00:07:35.854 4032.985 - 4058.191: 0.0340% ( 5) 00:07:35.854 4058.191 - 4083.397: 0.0510% ( 3) 00:07:35.854 4108.603 - 4133.809: 0.0679% ( 3) 00:07:35.854 4133.809 - 4159.015: 0.0793% ( 2) 00:07:35.854 4159.015 - 4184.222: 0.0906% ( 2) 00:07:35.854 4184.222 - 4209.428: 0.1019% ( 2) 00:07:35.854 4209.428 - 4234.634: 0.1132% ( 2) 00:07:35.854 4234.634 - 4259.840: 0.1245% ( 2) 00:07:35.854 4259.840 - 4285.046: 0.1415% ( 3) 00:07:35.854 4285.046 - 4310.252: 0.1529% ( 2) 00:07:35.854 4310.252 - 4335.458: 0.1642% ( 2) 00:07:35.854 4335.458 - 4360.665: 0.1755% ( 2) 00:07:35.854 4360.665 - 4385.871: 0.1868% ( 2) 00:07:35.854 4385.871 - 4411.077: 0.2038% ( 3) 00:07:35.854 4411.077 - 4436.283: 0.2151% ( 2) 00:07:35.854 4436.283 - 4461.489: 0.2264% ( 2) 00:07:35.854 4461.489 - 4486.695: 0.2378% ( 2) 00:07:35.854 4486.695 - 4511.902: 0.2491% ( 2) 00:07:35.854 4511.902 - 4537.108: 0.2604% ( 2) 00:07:35.854 4537.108 - 4562.314: 0.2717% ( 2) 00:07:35.854 4562.314 - 4587.520: 0.2831% ( 2) 00:07:35.854 4587.520 - 4612.726: 0.2887% ( 1) 00:07:35.854 4612.726 - 4637.932: 0.3057% ( 3) 00:07:35.854 4637.932 - 4663.138: 0.3170% ( 2) 00:07:35.854 4663.138 - 4688.345: 0.3284% ( 2) 00:07:35.854 4688.345 - 4713.551: 0.3397% ( 2) 00:07:35.855 4713.551 - 4738.757: 0.3510% ( 2) 00:07:35.855 4738.757 - 4763.963: 0.3623% ( 2) 00:07:35.855 5570.560 - 5595.766: 0.3736% ( 2) 00:07:35.855 5595.766 - 5620.972: 0.3963% ( 4) 00:07:35.855 5620.972 - 5646.178: 0.4076% ( 2) 00:07:35.855 5646.178 - 5671.385: 0.4133% ( 1) 00:07:35.855 5671.385 - 5696.591: 0.4246% ( 2) 00:07:35.855 5696.591 - 5721.797: 0.4416% ( 3) 00:07:35.855 5721.797 - 5747.003: 0.4529% ( 2) 00:07:35.855 5747.003 - 5772.209: 0.4642% ( 2) 00:07:35.855 5772.209 - 5797.415: 0.4755% ( 2) 00:07:35.855 5797.415 - 5822.622: 0.4869% ( 2) 00:07:35.855 5822.622 - 5847.828: 0.5038% ( 3) 00:07:35.855 5847.828 - 5873.034: 0.5152% ( 2) 00:07:35.855 5873.034 - 5898.240: 0.5265% ( 2) 00:07:35.855 5898.240 - 5923.446: 0.5378% ( 2) 00:07:35.855 5923.446 - 5948.652: 0.5491% ( 2) 00:07:35.855 5948.652 - 5973.858: 0.5661% ( 3) 00:07:35.855 5973.858 - 5999.065: 0.5774% ( 2) 00:07:35.855 5999.065 - 6024.271: 0.5888% ( 2) 00:07:35.855 6024.271 - 6049.477: 0.6001% ( 2) 00:07:35.855 6049.477 - 6074.683: 0.6284% ( 5) 00:07:35.855 6074.683 - 6099.889: 0.6850% ( 10) 00:07:35.855 6099.889 - 6125.095: 0.7869% ( 18) 00:07:35.855 6125.095 - 6150.302: 0.9851% ( 35) 00:07:35.855 6150.302 - 6175.508: 1.2115% ( 40) 00:07:35.855 6175.508 - 6200.714: 1.6021% ( 69) 00:07:35.855 6200.714 - 6225.920: 2.1796% ( 102) 00:07:35.855 6225.920 - 6251.126: 3.0005% ( 145) 00:07:35.855 6251.126 - 6276.332: 
00:07:35.855 [continuation of the preceding latency histogram; Range in us / Cumulative IO count buckets elided, climbing from 4.2120% in the ~6276us range to 100.0000% at 30650.683us]
00:07:35.856
00:07:35.856 Latency histogram for PCIE (0000:00:12.0) NSID 1 from core 0:
00:07:35.856 ==============================================================================
00:07:35.856 Range in us Cumulative IO count
00:07:35.856 [histogram buckets elided: ~3831us through 30449.034us, where the cumulative distribution reaches 100.0000%]
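Each elided bucket line above has the form "lower_us - upper_us: cumulative% ( count )": the bucket bounds in microseconds, the cumulative percentage of I/Os completed at or below the upper bound, and the number of I/Os that landed in that bucket. A minimal sketch for recovering these tuples from the raw log text (the regex and helper name are mine, not part of SPDK):

    import re

    # Each bucket line in the raw histogram dump looks like
    #   "6276.332 - 6301.538: 5.4857% ( 225)"
    # i.e. bucket bounds in us, the cumulative percentage of I/Os at or
    # below the upper bound, and the I/O count inside the bucket.
    BUCKET_RE = re.compile(r"(\d+\.\d+)\s*-\s*(\d+\.\d+):\s*(\d+\.\d+)%\s*\(\s*(\d+)\)")

    def parse_buckets(text):
        """Yield (lower_us, upper_us, cumulative_pct, count) tuples from log text."""
        for m in BUCKET_RE.finditer(text):
            lo, hi, pct, count = m.groups()
            yield float(lo), float(hi), float(pct), int(count)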
00:07:35.857
00:07:35.857 Latency histogram for PCIE (0000:00:12.0) NSID 2 from core 0:
00:07:35.857 ==============================================================================
00:07:35.857 Range in us Cumulative IO count
00:07:35.858 [histogram buckets elided: ~3629us through 29642.437us, where the cumulative distribution reaches 100.0000%]
00:07:35.858
00:07:35.858 Latency histogram for PCIE (0000:00:12.0) NSID 3 from core 0:
00:07:35.858 ==============================================================================
00:07:35.858 Range in us Cumulative IO count
00:07:35.859 [histogram buckets elided: ~3428us through 29037.489us, where the cumulative distribution reaches 100.0000%]
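The percentile values reported in the "Summary latency data" blocks of the next run line up exactly with bucket upper bounds from these histograms (e.g. 6856.074us, 7057.723us), so a given percentile can plausibly be read off as the upper bound of the first bucket whose cumulative percentage reaches the target. A minimal sketch under that assumption, reusing the tuples from the parser above:

    def percentile_us(buckets, target_pct):
        """Upper bound (us) of the first bucket whose cumulative percentage
        reaches target_pct; buckets are (lo, hi, cum_pct, count) tuples in
        ascending latency order, e.g. from parse_buckets() above."""
        for _lo, hi, cum_pct, _count in buckets:
            if cum_pct >= target_pct:
                return hi
        raise ValueError("histogram never reaches the requested percentile")

    # e.g. percentile_us(list(parse_buckets(log_text)), 99.0)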
00:07:35.860
00:07:35.860 04:57:55 nvme.nvme_perf -- nvme/nvme.sh@23 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_perf -q 128 -w write -o 12288 -t 1 -LL -i 0
00:07:37.235 Initializing NVMe Controllers
00:07:37.235 Attached to NVMe Controller at 0000:00:10.0 [1b36:0010]
00:07:37.235 Attached to NVMe Controller at 0000:00:11.0 [1b36:0010]
00:07:37.235 Attached to NVMe Controller at 0000:00:13.0 [1b36:0010]
00:07:37.235 Attached to NVMe Controller at 0000:00:12.0 [1b36:0010]
00:07:37.235 Associating PCIE (0000:00:10.0) NSID 1 with lcore 0
00:07:37.235 Associating PCIE (0000:00:11.0) NSID 1 with lcore 0
00:07:37.235 Associating PCIE (0000:00:13.0) NSID 1 with lcore 0
00:07:37.235 Associating PCIE (0000:00:12.0) NSID 1 with lcore 0
00:07:37.235 Associating PCIE (0000:00:12.0) NSID 2 with lcore 0
00:07:37.235 Associating PCIE (0000:00:12.0) NSID 3 with lcore 0
00:07:37.235 Initialization complete. Launching workers.
00:07:37.235 ========================================================
00:07:37.235                                                           Latency(us)
00:07:37.235 Device Information                     :       IOPS      MiB/s    Average        min        max
00:07:37.235 PCIE (0000:00:10.0) NSID 1 from core 0:   16471.07     193.02    7773.90    5310.81   24914.04
00:07:37.235 PCIE (0000:00:11.0) NSID 1 from core 0:   16471.07     193.02    7765.67    5382.11   23782.74
00:07:37.235 PCIE (0000:00:13.0) NSID 1 from core 0:   16471.07     193.02    7758.26    4945.27   23644.50
00:07:37.235 PCIE (0000:00:12.0) NSID 1 from core 0:   16471.07     193.02    7750.73    4554.69   23250.58
00:07:37.235 PCIE (0000:00:12.0) NSID 2 from core 0:   16471.07     193.02    7743.12    4368.65   22584.00
00:07:37.235 PCIE (0000:00:12.0) NSID 3 from core 0:   16471.07     193.02    7735.64    4125.73   21636.17
00:07:37.235 ========================================================
00:07:37.235 Total                                  :   98826.42    1158.12    7754.56    4125.73   24914.04
00:07:37.235
00:07:37.235 Summary latency data, all six namespaces from core 0 (values in us; columns in the same device order as the table above):
00:07:37.235 =================================================================================
00:07:37.235 Percentile :    10.0 NS1     11.0 NS1     13.0 NS1     12.0 NS1     12.0 NS2     12.0 NS3
00:07:37.235   1.00000% :   6503.188     6654.425     6553.600     6553.600     6553.600     6553.600
00:07:37.235  10.00000% :   6856.074     6906.486     6906.486     6906.486     6906.486     6906.486
00:07:37.235  25.00000% :   7007.311     7057.723     7057.723     7057.723     7057.723     7057.723
00:07:37.235  50.00000% :   7259.372     7208.960     7259.372     7208.960     7208.960     7208.960
00:07:37.235  75.00000% :   7662.671     7561.846     7561.846     7561.846     7561.846     7561.846
00:07:37.235  90.00000% :   8973.391     8973.391     9175.040     9225.452     9074.215     8922.978
00:07:37.235  95.00000% :  11544.418    11544.418    11443.594    10989.883    11090.708    11393.182
00:07:37.235  98.00000% :  13812.972    13812.972    14014.622    14216.271    14014.622    14115.446
00:07:37.235  99.00000% :  17140.185    17039.360    16837.711    16736.886    16535.237    16535.237
00:07:37.235  99.50000% :  18854.203    18652.554    17946.782    17442.658    17039.360    17442.658
00:07:37.235  99.90000% :  24500.382    23492.135    23290.486    22887.188    22181.415    21273.994
00:07:37.236  99.99000% :  24903.680    23794.609    23693.785    23290.486    22584.714    21677.292
00:07:37.236  99.99900% :  25004.505    23794.609    23693.785    23290.486    22584.714    21677.292
00:07:37.236  99.99990% :  25004.505    23794.609    23693.785    23290.486    22584.714    21677.292
00:07:37.236  99.99999% :  25004.505    23794.609    23693.785    23290.486    22584.714    21677.292
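As a quick sanity check on the tables above: each namespace sustained 16471.07 write IOPS at the 12288-byte I/O size requested with -o 12288, and 16471.07 * 12288 / 2^20 is about 193.02 MiB/s, matching the MiB/s column; the Total row is simply the six per-namespace rows summed. In code:

    # Sanity-check the summary table: bandwidth = IOPS * I/O size.
    IO_SIZE_BYTES = 12288        # from the -o 12288 flag
    per_ns_iops = 16471.07       # IOPS column, identical for all six namespaces

    mib_per_s = per_ns_iops * IO_SIZE_BYTES / 2**20
    print(f"{mib_per_s:.2f} MiB/s")    # -> 193.02, matching the MiB/s column

    total_iops = 6 * per_ns_iops
    print(f"{total_iops:.2f} IOPS")    # -> 98826.42, matching the Total row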
00:07:37.236
00:07:37.236 Latency histogram for PCIE (0000:00:10.0) NSID 1 from core 0:
00:07:37.236 ==============================================================================
00:07:37.236 Range in us Cumulative IO count
00:07:37.237 [histogram buckets elided: ~5293us through 25004.505us, where the cumulative distribution reaches 100.0000%]
00:07:37.237
00:07:37.237 Latency histogram for PCIE (0000:00:11.0) NSID 1 from core 0:
00:07:37.237 ==============================================================================
00:07:37.237 Range in us Cumulative IO count
00:07:37.238 [histogram buckets elided: ~5368us onward; the log excerpt breaks off mid-histogram at 12098.954us / 96.4268% cumulative]
- 12149.366: 96.5480% ( 20) 00:07:37.238 12149.366 - 12199.778: 96.6630% ( 19) 00:07:37.238 12199.778 - 12250.191: 96.8932% ( 38) 00:07:37.238 12250.191 - 12300.603: 96.9477% ( 9) 00:07:37.238 12300.603 - 12351.015: 96.9901% ( 7) 00:07:37.238 12351.015 - 12401.428: 97.0203% ( 5) 00:07:37.238 12401.428 - 12451.840: 97.0749% ( 9) 00:07:37.238 12451.840 - 12502.252: 97.1354% ( 10) 00:07:37.238 12502.252 - 12552.665: 97.1960% ( 10) 00:07:37.238 12552.665 - 12603.077: 97.2323% ( 6) 00:07:37.238 12603.077 - 12653.489: 97.2747% ( 7) 00:07:37.238 12653.489 - 12703.902: 97.3353% ( 10) 00:07:37.238 12703.902 - 12754.314: 97.4019% ( 11) 00:07:37.238 12754.314 - 12804.726: 97.4382% ( 6) 00:07:37.238 12804.726 - 12855.138: 97.4625% ( 4) 00:07:37.238 12855.138 - 12905.551: 97.4867% ( 4) 00:07:37.238 12905.551 - 13006.375: 97.5412% ( 9) 00:07:37.238 13006.375 - 13107.200: 97.5957% ( 9) 00:07:37.238 13107.200 - 13208.025: 97.6199% ( 4) 00:07:37.238 13208.025 - 13308.849: 97.6744% ( 9) 00:07:37.238 13308.849 - 13409.674: 97.7410% ( 11) 00:07:37.238 13409.674 - 13510.498: 97.8198% ( 13) 00:07:37.238 13510.498 - 13611.323: 97.9046% ( 14) 00:07:37.238 13611.323 - 13712.148: 97.9772% ( 12) 00:07:37.238 13712.148 - 13812.972: 98.0438% ( 11) 00:07:37.238 13812.972 - 13913.797: 98.1044% ( 10) 00:07:37.238 13913.797 - 14014.622: 98.2134% ( 18) 00:07:37.238 14014.622 - 14115.446: 98.3224% ( 18) 00:07:37.238 14115.446 - 14216.271: 98.4254% ( 17) 00:07:37.238 14216.271 - 14317.095: 98.4496% ( 4) 00:07:37.238 14821.218 - 14922.043: 98.4557% ( 1) 00:07:37.238 15627.815 - 15728.640: 98.4678% ( 2) 00:07:37.238 15728.640 - 15829.465: 98.4920% ( 4) 00:07:37.238 15829.465 - 15930.289: 98.5162% ( 4) 00:07:37.238 15930.289 - 16031.114: 98.5828% ( 11) 00:07:37.238 16031.114 - 16131.938: 98.7585% ( 29) 00:07:37.238 16131.938 - 16232.763: 98.7706% ( 2) 00:07:37.238 16232.763 - 16333.588: 98.7888% ( 3) 00:07:37.238 16333.588 - 16434.412: 98.8009% ( 2) 00:07:37.238 16434.412 - 16535.237: 98.8130% ( 2) 00:07:37.238 16535.237 - 16636.062: 98.8614% ( 8) 00:07:37.238 16636.062 - 16736.886: 98.8917% ( 5) 00:07:37.238 16736.886 - 16837.711: 98.9099% ( 3) 00:07:37.238 16837.711 - 16938.535: 98.9281% ( 3) 00:07:37.238 16938.535 - 17039.360: 99.1219% ( 32) 00:07:37.238 17039.360 - 17140.185: 99.1461% ( 4) 00:07:37.238 17140.185 - 17241.009: 99.1582% ( 2) 00:07:37.238 17241.009 - 17341.834: 99.1703% ( 2) 00:07:37.238 17341.834 - 17442.658: 99.1885% ( 3) 00:07:37.238 17442.658 - 17543.483: 99.2006% ( 2) 00:07:37.238 17543.483 - 17644.308: 99.2127% ( 2) 00:07:37.238 17644.308 - 17745.132: 99.2248% ( 2) 00:07:37.238 17946.782 - 18047.606: 99.2309% ( 1) 00:07:37.238 18047.606 - 18148.431: 99.2914% ( 10) 00:07:37.238 18148.431 - 18249.255: 99.3580% ( 11) 00:07:37.238 18249.255 - 18350.080: 99.4307% ( 12) 00:07:37.238 18350.080 - 18450.905: 99.4549% ( 4) 00:07:37.238 18450.905 - 18551.729: 99.4852% ( 5) 00:07:37.238 18551.729 - 18652.554: 99.5094% ( 4) 00:07:37.238 18652.554 - 18753.378: 99.5397% ( 5) 00:07:37.238 18753.378 - 18854.203: 99.5640% ( 4) 00:07:37.238 18854.203 - 18955.028: 99.5942% ( 5) 00:07:37.238 18955.028 - 19055.852: 99.6124% ( 3) 00:07:37.238 22383.065 - 22483.889: 99.6185% ( 1) 00:07:37.238 22483.889 - 22584.714: 99.6669% ( 8) 00:07:37.238 22584.714 - 22685.538: 99.7154% ( 8) 00:07:37.238 22685.538 - 22786.363: 99.7578% ( 7) 00:07:37.238 22786.363 - 22887.188: 99.7820% ( 4) 00:07:37.238 22887.188 - 22988.012: 99.8123% ( 5) 00:07:37.238 22988.012 - 23088.837: 99.8304% ( 3) 00:07:37.238 23088.837 - 23189.662: 99.8607% ( 5) 
00:07:37.238 23189.662 - 23290.486: 99.8728% ( 2) 00:07:37.238 23290.486 - 23391.311: 99.8970% ( 4) 00:07:37.238 23391.311 - 23492.135: 99.9213% ( 4) 00:07:37.238 23492.135 - 23592.960: 99.9516% ( 5) 00:07:37.238 23592.960 - 23693.785: 99.9818% ( 5) 00:07:37.238 23693.785 - 23794.609: 100.0000% ( 3) 00:07:37.238 00:07:37.238 Latency histogram for PCIE (0000:00:13.0) NSID 1 from core 0: 00:07:37.238 ============================================================================== 00:07:37.238 Range in us Cumulative IO count 00:07:37.238 4940.406 - 4965.612: 0.0121% ( 2) 00:07:37.238 4965.612 - 4990.818: 0.0242% ( 2) 00:07:37.238 4990.818 - 5016.025: 0.0666% ( 7) 00:07:37.238 5016.025 - 5041.231: 0.0908% ( 4) 00:07:37.238 5041.231 - 5066.437: 0.1272% ( 6) 00:07:37.238 5066.437 - 5091.643: 0.2241% ( 16) 00:07:37.238 5091.643 - 5116.849: 0.2725% ( 8) 00:07:37.238 5116.849 - 5142.055: 0.2968% ( 4) 00:07:37.238 5142.055 - 5167.262: 0.3089% ( 2) 00:07:37.238 5167.262 - 5192.468: 0.3210% ( 2) 00:07:37.238 5192.468 - 5217.674: 0.3331% ( 2) 00:07:37.238 5217.674 - 5242.880: 0.3452% ( 2) 00:07:37.238 5242.880 - 5268.086: 0.3573% ( 2) 00:07:37.238 5268.086 - 5293.292: 0.3694% ( 2) 00:07:37.238 5293.292 - 5318.498: 0.3815% ( 2) 00:07:37.238 5318.498 - 5343.705: 0.3876% ( 1) 00:07:37.238 6150.302 - 6175.508: 0.3937% ( 1) 00:07:37.238 6251.126 - 6276.332: 0.3997% ( 1) 00:07:37.238 6276.332 - 6301.538: 0.4058% ( 1) 00:07:37.238 6301.538 - 6326.745: 0.4118% ( 1) 00:07:37.238 6326.745 - 6351.951: 0.4360% ( 4) 00:07:37.238 6351.951 - 6377.157: 0.4784% ( 7) 00:07:37.238 6377.157 - 6402.363: 0.5208% ( 7) 00:07:37.238 6402.363 - 6427.569: 0.5935% ( 12) 00:07:37.238 6427.569 - 6452.775: 0.6965% ( 17) 00:07:37.238 6452.775 - 6503.188: 0.9327% ( 39) 00:07:37.238 6503.188 - 6553.600: 1.1931% ( 43) 00:07:37.238 6553.600 - 6604.012: 1.5988% ( 67) 00:07:37.238 6604.012 - 6654.425: 2.3014% ( 116) 00:07:37.238 6654.425 - 6704.837: 3.4036% ( 182) 00:07:37.238 6704.837 - 6755.249: 4.7662% ( 225) 00:07:37.238 6755.249 - 6805.662: 6.8920% ( 351) 00:07:37.238 6805.662 - 6856.074: 9.4053% ( 415) 00:07:37.238 6856.074 - 6906.486: 12.8331% ( 566) 00:07:37.238 6906.486 - 6956.898: 17.0603% ( 698) 00:07:37.238 6956.898 - 7007.311: 22.2687% ( 860) 00:07:37.238 7007.311 - 7057.723: 29.1122% ( 1130) 00:07:37.238 7057.723 - 7108.135: 36.0162% ( 1140) 00:07:37.238 7108.135 - 7158.548: 42.9566% ( 1146) 00:07:37.238 7158.548 - 7208.960: 48.8130% ( 967) 00:07:37.238 7208.960 - 7259.372: 56.3953% ( 1252) 00:07:37.238 7259.372 - 7309.785: 62.0397% ( 932) 00:07:37.238 7309.785 - 7360.197: 65.6856% ( 602) 00:07:37.238 7360.197 - 7410.609: 69.2951% ( 596) 00:07:37.238 7410.609 - 7461.022: 72.1415% ( 470) 00:07:37.238 7461.022 - 7511.434: 74.1037% ( 324) 00:07:37.238 7511.434 - 7561.846: 76.1265% ( 334) 00:07:37.238 7561.846 - 7612.258: 77.7859% ( 274) 00:07:37.238 7612.258 - 7662.671: 78.8699% ( 179) 00:07:37.238 7662.671 - 7713.083: 79.6693% ( 132) 00:07:37.238 7713.083 - 7763.495: 80.2507% ( 96) 00:07:37.238 7763.495 - 7813.908: 80.8503% ( 99) 00:07:37.238 7813.908 - 7864.320: 81.3711% ( 86) 00:07:37.238 7864.320 - 7914.732: 81.8980% ( 87) 00:07:37.238 7914.732 - 7965.145: 82.2735% ( 62) 00:07:37.238 7965.145 - 8015.557: 82.8428% ( 94) 00:07:37.238 8015.557 - 8065.969: 83.2788% ( 72) 00:07:37.238 8065.969 - 8116.382: 83.7391% ( 76) 00:07:37.238 8116.382 - 8166.794: 84.1691% ( 71) 00:07:37.238 8166.794 - 8217.206: 84.9140% ( 123) 00:07:37.238 8217.206 - 8267.618: 85.4227% ( 84) 00:07:37.238 8267.618 - 8318.031: 85.7982% ( 62) 00:07:37.238 
8318.031 - 8368.443: 86.0647% ( 44) 00:07:37.238 8368.443 - 8418.855: 86.5068% ( 73) 00:07:37.238 8418.855 - 8469.268: 86.7369% ( 38) 00:07:37.238 8469.268 - 8519.680: 87.0094% ( 45) 00:07:37.238 8519.680 - 8570.092: 87.2517% ( 40) 00:07:37.238 8570.092 - 8620.505: 87.4697% ( 36) 00:07:37.238 8620.505 - 8670.917: 87.7422% ( 45) 00:07:37.238 8670.917 - 8721.329: 87.9118% ( 28) 00:07:37.238 8721.329 - 8771.742: 88.0693% ( 26) 00:07:37.238 8771.742 - 8822.154: 88.3236% ( 42) 00:07:37.238 8822.154 - 8872.566: 88.4872% ( 27) 00:07:37.238 8872.566 - 8922.978: 88.9111% ( 70) 00:07:37.238 8922.978 - 8973.391: 89.0988% ( 31) 00:07:37.238 8973.391 - 9023.803: 89.2866% ( 31) 00:07:37.238 9023.803 - 9074.215: 89.5773% ( 48) 00:07:37.238 9074.215 - 9124.628: 89.7953% ( 36) 00:07:37.238 9124.628 - 9175.040: 90.2859% ( 81) 00:07:37.238 9175.040 - 9225.452: 90.6189% ( 55) 00:07:37.238 9225.452 - 9275.865: 90.9096% ( 48) 00:07:37.238 9275.865 - 9326.277: 91.1943% ( 47) 00:07:37.238 9326.277 - 9376.689: 91.4305% ( 39) 00:07:37.238 9376.689 - 9427.102: 91.6122% ( 30) 00:07:37.238 9427.102 - 9477.514: 91.7878% ( 29) 00:07:37.238 9477.514 - 9527.926: 91.8907% ( 17) 00:07:37.239 9527.926 - 9578.338: 92.0179% ( 21) 00:07:37.239 9578.338 - 9628.751: 92.2057% ( 31) 00:07:37.239 9628.751 - 9679.163: 92.3692% ( 27) 00:07:37.239 9679.163 - 9729.575: 92.4600% ( 15) 00:07:37.239 9729.575 - 9779.988: 92.5569% ( 16) 00:07:37.239 9779.988 - 9830.400: 92.6478% ( 15) 00:07:37.239 9830.400 - 9880.812: 92.7326% ( 14) 00:07:37.239 9880.812 - 9931.225: 92.8416% ( 18) 00:07:37.239 9931.225 - 9981.637: 93.0172% ( 29) 00:07:37.239 9981.637 - 10032.049: 93.1141% ( 16) 00:07:37.239 10032.049 - 10082.462: 93.1625% ( 8) 00:07:37.239 10082.462 - 10132.874: 93.2292% ( 11) 00:07:37.239 10132.874 - 10183.286: 93.3018% ( 12) 00:07:37.239 10183.286 - 10233.698: 93.3866% ( 14) 00:07:37.239 10233.698 - 10284.111: 93.4593% ( 12) 00:07:37.239 10284.111 - 10334.523: 93.5320% ( 12) 00:07:37.239 10334.523 - 10384.935: 93.6470% ( 19) 00:07:37.239 10384.935 - 10435.348: 93.7803% ( 22) 00:07:37.239 10435.348 - 10485.760: 93.8469% ( 11) 00:07:37.239 10485.760 - 10536.172: 93.9438% ( 16) 00:07:37.239 10536.172 - 10586.585: 94.0104% ( 11) 00:07:37.239 10586.585 - 10636.997: 94.0649% ( 9) 00:07:37.239 10636.997 - 10687.409: 94.1073% ( 7) 00:07:37.239 10687.409 - 10737.822: 94.1437% ( 6) 00:07:37.239 10737.822 - 10788.234: 94.1800% ( 6) 00:07:37.239 10788.234 - 10838.646: 94.2224% ( 7) 00:07:37.239 10838.646 - 10889.058: 94.2708% ( 8) 00:07:37.239 10889.058 - 10939.471: 94.3496% ( 13) 00:07:37.239 10939.471 - 10989.883: 94.4586% ( 18) 00:07:37.239 10989.883 - 11040.295: 94.5434% ( 14) 00:07:37.239 11040.295 - 11090.708: 94.6221% ( 13) 00:07:37.239 11090.708 - 11141.120: 94.6705% ( 8) 00:07:37.239 11141.120 - 11191.532: 94.7796% ( 18) 00:07:37.239 11191.532 - 11241.945: 94.8522% ( 12) 00:07:37.239 11241.945 - 11292.357: 94.8946% ( 7) 00:07:37.239 11292.357 - 11342.769: 94.9310% ( 6) 00:07:37.239 11342.769 - 11393.182: 94.9794% ( 8) 00:07:37.239 11393.182 - 11443.594: 95.0157% ( 6) 00:07:37.239 11443.594 - 11494.006: 95.0642% ( 8) 00:07:37.239 11494.006 - 11544.418: 95.1005% ( 6) 00:07:37.239 11544.418 - 11594.831: 95.1429% ( 7) 00:07:37.239 11594.831 - 11645.243: 95.1732% ( 5) 00:07:37.239 11645.243 - 11695.655: 95.2035% ( 5) 00:07:37.239 11695.655 - 11746.068: 95.2277% ( 4) 00:07:37.239 11746.068 - 11796.480: 95.2641% ( 6) 00:07:37.239 11796.480 - 11846.892: 95.3004% ( 6) 00:07:37.239 11846.892 - 11897.305: 95.3307% ( 5) 00:07:37.239 11897.305 - 
11947.717: 95.3670% ( 6) 00:07:37.239 11947.717 - 11998.129: 95.4215% ( 9) 00:07:37.239 11998.129 - 12048.542: 95.4760% ( 9) 00:07:37.239 12048.542 - 12098.954: 95.5426% ( 11) 00:07:37.239 12098.954 - 12149.366: 95.7728% ( 38) 00:07:37.239 12149.366 - 12199.778: 95.8273% ( 9) 00:07:37.239 12199.778 - 12250.191: 95.9060% ( 13) 00:07:37.239 12250.191 - 12300.603: 96.0150% ( 18) 00:07:37.239 12300.603 - 12351.015: 96.1240% ( 18) 00:07:37.239 12351.015 - 12401.428: 96.4208% ( 49) 00:07:37.239 12401.428 - 12451.840: 96.6388% ( 36) 00:07:37.239 12451.840 - 12502.252: 96.8811% ( 40) 00:07:37.239 12502.252 - 12552.665: 97.0143% ( 22) 00:07:37.239 12552.665 - 12603.077: 97.0991% ( 14) 00:07:37.239 12603.077 - 12653.489: 97.1899% ( 15) 00:07:37.239 12653.489 - 12703.902: 97.2505% ( 10) 00:07:37.239 12703.902 - 12754.314: 97.2868% ( 6) 00:07:37.239 12754.314 - 12804.726: 97.3110% ( 4) 00:07:37.239 12804.726 - 12855.138: 97.3413% ( 5) 00:07:37.239 12855.138 - 12905.551: 97.3656% ( 4) 00:07:37.239 12905.551 - 13006.375: 97.4201% ( 9) 00:07:37.239 13006.375 - 13107.200: 97.4685% ( 8) 00:07:37.239 13107.200 - 13208.025: 97.4988% ( 5) 00:07:37.239 13208.025 - 13308.849: 97.5715% ( 12) 00:07:37.239 13308.849 - 13409.674: 97.6320% ( 10) 00:07:37.239 13409.674 - 13510.498: 97.7108% ( 13) 00:07:37.239 13510.498 - 13611.323: 97.7895% ( 13) 00:07:37.239 13611.323 - 13712.148: 97.8743% ( 14) 00:07:37.239 13712.148 - 13812.972: 97.9167% ( 7) 00:07:37.239 13812.972 - 13913.797: 97.9651% ( 8) 00:07:37.239 13913.797 - 14014.622: 98.0317% ( 11) 00:07:37.239 14014.622 - 14115.446: 98.0620% ( 5) 00:07:37.239 14115.446 - 14216.271: 98.1105% ( 8) 00:07:37.239 14216.271 - 14317.095: 98.1529% ( 7) 00:07:37.239 14317.095 - 14417.920: 98.1892% ( 6) 00:07:37.239 14417.920 - 14518.745: 98.2195% ( 5) 00:07:37.239 14518.745 - 14619.569: 98.2437% ( 4) 00:07:37.239 14619.569 - 14720.394: 98.2740% ( 5) 00:07:37.239 14720.394 - 14821.218: 98.3467% ( 12) 00:07:37.239 14821.218 - 14922.043: 98.4193% ( 12) 00:07:37.239 14922.043 - 15022.868: 98.4496% ( 5) 00:07:37.239 15123.692 - 15224.517: 98.4617% ( 2) 00:07:37.239 15224.517 - 15325.342: 98.4799% ( 3) 00:07:37.239 15325.342 - 15426.166: 98.5405% ( 10) 00:07:37.239 15426.166 - 15526.991: 98.6131% ( 12) 00:07:37.239 15526.991 - 15627.815: 98.6797% ( 11) 00:07:37.239 15627.815 - 15728.640: 98.6919% ( 2) 00:07:37.239 15728.640 - 15829.465: 98.7040% ( 2) 00:07:37.239 15829.465 - 15930.289: 98.7221% ( 3) 00:07:37.239 15930.289 - 16031.114: 98.7403% ( 3) 00:07:37.239 16031.114 - 16131.938: 98.7645% ( 4) 00:07:37.239 16131.938 - 16232.763: 98.7948% ( 5) 00:07:37.239 16232.763 - 16333.588: 98.8009% ( 1) 00:07:37.239 16333.588 - 16434.412: 98.8251% ( 4) 00:07:37.239 16434.412 - 16535.237: 98.8796% ( 9) 00:07:37.239 16535.237 - 16636.062: 98.9220% ( 7) 00:07:37.239 16636.062 - 16736.886: 98.9462% ( 4) 00:07:37.239 16736.886 - 16837.711: 99.1037% ( 26) 00:07:37.239 16837.711 - 16938.535: 99.1219% ( 3) 00:07:37.239 16938.535 - 17039.360: 99.1340% ( 2) 00:07:37.239 17039.360 - 17140.185: 99.1461% ( 2) 00:07:37.239 17140.185 - 17241.009: 99.1582% ( 2) 00:07:37.239 17241.009 - 17341.834: 99.1945% ( 6) 00:07:37.239 17341.834 - 17442.658: 99.2854% ( 15) 00:07:37.239 17442.658 - 17543.483: 99.3580% ( 12) 00:07:37.239 17543.483 - 17644.308: 99.4247% ( 11) 00:07:37.239 17644.308 - 17745.132: 99.4671% ( 7) 00:07:37.239 17745.132 - 17845.957: 99.4913% ( 4) 00:07:37.239 17845.957 - 17946.782: 99.5216% ( 5) 00:07:37.239 17946.782 - 18047.606: 99.5518% ( 5) 00:07:37.239 18047.606 - 18148.431: 99.5821% ( 5) 
00:07:37.239 18148.431 - 18249.255: 99.6124% ( 5) 00:07:37.239 21979.766 - 22080.591: 99.6245% ( 2) 00:07:37.239 22080.591 - 22181.415: 99.6427% ( 3) 00:07:37.239 22181.415 - 22282.240: 99.6669% ( 4) 00:07:37.239 22282.240 - 22383.065: 99.6911% ( 4) 00:07:37.239 22383.065 - 22483.889: 99.7032% ( 2) 00:07:37.239 22483.889 - 22584.714: 99.7699% ( 11) 00:07:37.239 22584.714 - 22685.538: 99.7820% ( 2) 00:07:37.239 22685.538 - 22786.363: 99.8001% ( 3) 00:07:37.239 22786.363 - 22887.188: 99.8183% ( 3) 00:07:37.239 22887.188 - 22988.012: 99.8425% ( 4) 00:07:37.239 22988.012 - 23088.837: 99.8607% ( 3) 00:07:37.239 23088.837 - 23189.662: 99.8849% ( 4) 00:07:37.239 23189.662 - 23290.486: 99.9031% ( 3) 00:07:37.239 23290.486 - 23391.311: 99.9273% ( 4) 00:07:37.239 23391.311 - 23492.135: 99.9516% ( 4) 00:07:37.239 23492.135 - 23592.960: 99.9758% ( 4) 00:07:37.239 23592.960 - 23693.785: 100.0000% ( 4) 00:07:37.239 00:07:37.239 Latency histogram for PCIE (0000:00:12.0) NSID 1 from core 0: 00:07:37.239 ============================================================================== 00:07:37.239 Range in us Cumulative IO count 00:07:37.239 4537.108 - 4562.314: 0.0061% ( 1) 00:07:37.239 4562.314 - 4587.520: 0.0182% ( 2) 00:07:37.239 4587.520 - 4612.726: 0.0303% ( 2) 00:07:37.239 4612.726 - 4637.932: 0.0484% ( 3) 00:07:37.239 4637.932 - 4663.138: 0.0787% ( 5) 00:07:37.239 4663.138 - 4688.345: 0.1090% ( 5) 00:07:37.239 4688.345 - 4713.551: 0.1575% ( 8) 00:07:37.239 4713.551 - 4738.757: 0.2362% ( 13) 00:07:37.239 4738.757 - 4763.963: 0.2786% ( 7) 00:07:37.239 4763.963 - 4789.169: 0.3028% ( 4) 00:07:37.239 4789.169 - 4814.375: 0.3210% ( 3) 00:07:37.239 4814.375 - 4839.582: 0.3331% ( 2) 00:07:37.239 4839.582 - 4864.788: 0.3452% ( 2) 00:07:37.239 4864.788 - 4889.994: 0.3513% ( 1) 00:07:37.239 4889.994 - 4915.200: 0.3634% ( 2) 00:07:37.239 4915.200 - 4940.406: 0.3755% ( 2) 00:07:37.239 4940.406 - 4965.612: 0.3876% ( 2) 00:07:37.239 6099.889 - 6125.095: 0.3937% ( 1) 00:07:37.239 6125.095 - 6150.302: 0.3997% ( 1) 00:07:37.239 6150.302 - 6175.508: 0.4118% ( 2) 00:07:37.239 6175.508 - 6200.714: 0.4360% ( 4) 00:07:37.239 6200.714 - 6225.920: 0.4603% ( 4) 00:07:37.239 6225.920 - 6251.126: 0.4966% ( 6) 00:07:37.239 6251.126 - 6276.332: 0.5996% ( 17) 00:07:37.239 6276.332 - 6301.538: 0.6904% ( 15) 00:07:37.239 6301.538 - 6326.745: 0.7025% ( 2) 00:07:37.239 6326.745 - 6351.951: 0.7207% ( 3) 00:07:37.239 6351.951 - 6377.157: 0.7389% ( 3) 00:07:37.239 6377.157 - 6402.363: 0.7570% ( 3) 00:07:37.239 6402.363 - 6427.569: 0.7812% ( 4) 00:07:37.239 6427.569 - 6452.775: 0.8115% ( 5) 00:07:37.239 6452.775 - 6503.188: 0.8903% ( 13) 00:07:37.239 6503.188 - 6553.600: 1.0235% ( 22) 00:07:37.239 6553.600 - 6604.012: 1.3808% ( 59) 00:07:37.239 6604.012 - 6654.425: 2.0107% ( 104) 00:07:37.239 6654.425 - 6704.837: 2.7556% ( 123) 00:07:37.239 6704.837 - 6755.249: 4.0940% ( 221) 00:07:37.239 6755.249 - 6805.662: 6.1228% ( 335) 00:07:37.239 6805.662 - 6856.074: 8.8905% ( 457) 00:07:37.239 6856.074 - 6906.486: 12.3001% ( 563) 00:07:37.239 6906.486 - 6956.898: 16.4971% ( 693) 00:07:37.239 6956.898 - 7007.311: 22.4443% ( 982) 00:07:37.240 7007.311 - 7057.723: 29.9964% ( 1247) 00:07:37.240 7057.723 - 7108.135: 36.4886% ( 1072) 00:07:37.240 7108.135 - 7158.548: 43.5865% ( 1172) 00:07:37.240 7158.548 - 7208.960: 50.3634% ( 1119) 00:07:37.240 7208.960 - 7259.372: 57.0555% ( 1105) 00:07:37.240 7259.372 - 7309.785: 62.9239% ( 969) 00:07:37.240 7309.785 - 7360.197: 66.8120% ( 642) 00:07:37.240 7360.197 - 7410.609: 70.3246% ( 580) 00:07:37.240 7410.609 - 
7461.022: 72.6926% ( 391) 00:07:37.240 7461.022 - 7511.434: 74.5276% ( 303) 00:07:37.240 7511.434 - 7561.846: 76.3748% ( 305) 00:07:37.240 7561.846 - 7612.258: 77.8585% ( 245) 00:07:37.240 7612.258 - 7662.671: 78.8760% ( 168) 00:07:37.240 7662.671 - 7713.083: 79.5724% ( 115) 00:07:37.240 7713.083 - 7763.495: 80.2507% ( 112) 00:07:37.240 7763.495 - 7813.908: 80.8866% ( 105) 00:07:37.240 7813.908 - 7864.320: 81.4680% ( 96) 00:07:37.240 7864.320 - 7914.732: 81.9222% ( 75) 00:07:37.240 7914.732 - 7965.145: 82.5036% ( 96) 00:07:37.240 7965.145 - 8015.557: 83.2909% ( 130) 00:07:37.240 8015.557 - 8065.969: 83.8663% ( 95) 00:07:37.240 8065.969 - 8116.382: 84.3447% ( 79) 00:07:37.240 8116.382 - 8166.794: 84.7081% ( 60) 00:07:37.240 8166.794 - 8217.206: 85.0472% ( 56) 00:07:37.240 8217.206 - 8267.618: 85.3622% ( 52) 00:07:37.240 8267.618 - 8318.031: 85.5984% ( 39) 00:07:37.240 8318.031 - 8368.443: 85.8830% ( 47) 00:07:37.240 8368.443 - 8418.855: 86.5552% ( 111) 00:07:37.240 8418.855 - 8469.268: 86.9973% ( 73) 00:07:37.240 8469.268 - 8519.680: 87.2456% ( 41) 00:07:37.240 8519.680 - 8570.092: 87.4939% ( 41) 00:07:37.240 8570.092 - 8620.505: 87.7180% ( 37) 00:07:37.240 8620.505 - 8670.917: 87.8997% ( 30) 00:07:37.240 8670.917 - 8721.329: 88.0511% ( 25) 00:07:37.240 8721.329 - 8771.742: 88.2449% ( 32) 00:07:37.240 8771.742 - 8822.154: 88.3660% ( 20) 00:07:37.240 8822.154 - 8872.566: 88.6446% ( 46) 00:07:37.240 8872.566 - 8922.978: 88.9414% ( 49) 00:07:37.240 8922.978 - 8973.391: 89.1352% ( 32) 00:07:37.240 8973.391 - 9023.803: 89.3047% ( 28) 00:07:37.240 9023.803 - 9074.215: 89.4016% ( 16) 00:07:37.240 9074.215 - 9124.628: 89.5349% ( 22) 00:07:37.240 9124.628 - 9175.040: 89.6681% ( 22) 00:07:37.240 9175.040 - 9225.452: 90.0497% ( 63) 00:07:37.240 9225.452 - 9275.865: 90.2556% ( 34) 00:07:37.240 9275.865 - 9326.277: 90.4191% ( 27) 00:07:37.240 9326.277 - 9376.689: 90.5463% ( 21) 00:07:37.240 9376.689 - 9427.102: 90.7582% ( 35) 00:07:37.240 9427.102 - 9477.514: 90.9157% ( 26) 00:07:37.240 9477.514 - 9527.926: 91.0913% ( 29) 00:07:37.240 9527.926 - 9578.338: 91.4184% ( 54) 00:07:37.240 9578.338 - 9628.751: 91.7515% ( 55) 00:07:37.240 9628.751 - 9679.163: 91.9210% ( 28) 00:07:37.240 9679.163 - 9729.575: 92.0785% ( 26) 00:07:37.240 9729.575 - 9779.988: 92.3631% ( 47) 00:07:37.240 9779.988 - 9830.400: 92.5024% ( 23) 00:07:37.240 9830.400 - 9880.812: 92.5993% ( 16) 00:07:37.240 9880.812 - 9931.225: 92.6720% ( 12) 00:07:37.240 9931.225 - 9981.637: 92.7689% ( 16) 00:07:37.240 9981.637 - 10032.049: 92.8416% ( 12) 00:07:37.240 10032.049 - 10082.462: 92.8900% ( 8) 00:07:37.240 10082.462 - 10132.874: 92.9688% ( 13) 00:07:37.240 10132.874 - 10183.286: 93.0778% ( 18) 00:07:37.240 10183.286 - 10233.698: 93.2231% ( 24) 00:07:37.240 10233.698 - 10284.111: 93.3927% ( 28) 00:07:37.240 10284.111 - 10334.523: 93.5501% ( 26) 00:07:37.240 10334.523 - 10384.935: 93.7197% ( 28) 00:07:37.240 10384.935 - 10435.348: 93.9377% ( 36) 00:07:37.240 10435.348 - 10485.760: 94.0710% ( 22) 00:07:37.240 10485.760 - 10536.172: 94.1921% ( 20) 00:07:37.240 10536.172 - 10586.585: 94.3314% ( 23) 00:07:37.240 10586.585 - 10636.997: 94.4586% ( 21) 00:07:37.240 10636.997 - 10687.409: 94.6827% ( 37) 00:07:37.240 10687.409 - 10737.822: 94.7735% ( 15) 00:07:37.240 10737.822 - 10788.234: 94.8401% ( 11) 00:07:37.240 10788.234 - 10838.646: 94.8704% ( 5) 00:07:37.240 10838.646 - 10889.058: 94.9188% ( 8) 00:07:37.240 10889.058 - 10939.471: 94.9673% ( 8) 00:07:37.240 10939.471 - 10989.883: 95.0097% ( 7) 00:07:37.240 10989.883 - 11040.295: 95.0521% ( 7) 
00:07:37.240 11040.295 - 11090.708: 95.0945% ( 7) 00:07:37.240 11090.708 - 11141.120: 95.1187% ( 4) 00:07:37.240 11141.120 - 11191.532: 95.1429% ( 4) 00:07:37.240 11191.532 - 11241.945: 95.1672% ( 4) 00:07:37.240 11241.945 - 11292.357: 95.1853% ( 3) 00:07:37.240 11292.357 - 11342.769: 95.2035% ( 3) 00:07:37.240 11342.769 - 11393.182: 95.2217% ( 3) 00:07:37.240 11393.182 - 11443.594: 95.2398% ( 3) 00:07:37.240 11443.594 - 11494.006: 95.2580% ( 3) 00:07:37.240 11494.006 - 11544.418: 95.2822% ( 4) 00:07:37.240 11544.418 - 11594.831: 95.3004% ( 3) 00:07:37.240 11594.831 - 11645.243: 95.3186% ( 3) 00:07:37.240 11645.243 - 11695.655: 95.3367% ( 3) 00:07:37.240 11695.655 - 11746.068: 95.3488% ( 2) 00:07:37.240 11846.892 - 11897.305: 95.3731% ( 4) 00:07:37.240 11897.305 - 11947.717: 95.4215% ( 8) 00:07:37.240 11947.717 - 11998.129: 95.4639% ( 7) 00:07:37.240 11998.129 - 12048.542: 95.6395% ( 29) 00:07:37.240 12048.542 - 12098.954: 95.7425% ( 17) 00:07:37.240 12098.954 - 12149.366: 95.9363% ( 32) 00:07:37.240 12149.366 - 12199.778: 95.9666% ( 5) 00:07:37.240 12199.778 - 12250.191: 96.0271% ( 10) 00:07:37.240 12250.191 - 12300.603: 96.1059% ( 13) 00:07:37.240 12300.603 - 12351.015: 96.1967% ( 15) 00:07:37.240 12351.015 - 12401.428: 96.4087% ( 35) 00:07:37.240 12401.428 - 12451.840: 96.4995% ( 15) 00:07:37.240 12451.840 - 12502.252: 96.6812% ( 30) 00:07:37.240 12502.252 - 12552.665: 96.8205% ( 23) 00:07:37.240 12552.665 - 12603.077: 96.8689% ( 8) 00:07:37.240 12603.077 - 12653.489: 96.9295% ( 10) 00:07:37.240 12653.489 - 12703.902: 96.9780% ( 8) 00:07:37.240 12703.902 - 12754.314: 97.0022% ( 4) 00:07:37.240 12754.314 - 12804.726: 97.0143% ( 2) 00:07:37.240 12804.726 - 12855.138: 97.0203% ( 1) 00:07:37.240 12855.138 - 12905.551: 97.0385% ( 3) 00:07:37.240 12905.551 - 13006.375: 97.0627% ( 4) 00:07:37.240 13006.375 - 13107.200: 97.0870% ( 4) 00:07:37.240 13107.200 - 13208.025: 97.1778% ( 15) 00:07:37.240 13208.025 - 13308.849: 97.2989% ( 20) 00:07:37.240 13308.849 - 13409.674: 97.4625% ( 27) 00:07:37.240 13409.674 - 13510.498: 97.5594% ( 16) 00:07:37.240 13510.498 - 13611.323: 97.6441% ( 14) 00:07:37.240 13611.323 - 13712.148: 97.7168% ( 12) 00:07:37.240 13712.148 - 13812.972: 97.7955% ( 13) 00:07:37.240 13812.972 - 13913.797: 97.8682% ( 12) 00:07:37.240 13913.797 - 14014.622: 97.9348% ( 11) 00:07:37.240 14014.622 - 14115.446: 97.9954% ( 10) 00:07:37.240 14115.446 - 14216.271: 98.0378% ( 7) 00:07:37.240 14216.271 - 14317.095: 98.0681% ( 5) 00:07:37.240 14317.095 - 14417.920: 98.0802% ( 2) 00:07:37.240 14417.920 - 14518.745: 98.1226% ( 7) 00:07:37.240 14518.745 - 14619.569: 98.1589% ( 6) 00:07:37.240 14619.569 - 14720.394: 98.2619% ( 17) 00:07:37.240 14720.394 - 14821.218: 98.3164% ( 9) 00:07:37.240 14821.218 - 14922.043: 98.3588% ( 7) 00:07:37.240 14922.043 - 15022.868: 98.3951% ( 6) 00:07:37.240 15022.868 - 15123.692: 98.4436% ( 8) 00:07:37.240 15123.692 - 15224.517: 98.4981% ( 9) 00:07:37.240 15224.517 - 15325.342: 98.5405% ( 7) 00:07:37.240 15325.342 - 15426.166: 98.5889% ( 8) 00:07:37.240 15426.166 - 15526.991: 98.6313% ( 7) 00:07:37.240 15526.991 - 15627.815: 98.6919% ( 10) 00:07:37.240 15627.815 - 15728.640: 98.7585% ( 11) 00:07:37.240 15728.640 - 15829.465: 98.8251% ( 11) 00:07:37.240 15829.465 - 15930.289: 98.8433% ( 3) 00:07:37.240 16031.114 - 16131.938: 98.8493% ( 1) 00:07:37.240 16131.938 - 16232.763: 98.8735% ( 4) 00:07:37.240 16232.763 - 16333.588: 98.8917% ( 3) 00:07:37.240 16333.588 - 16434.412: 98.9159% ( 4) 00:07:37.240 16434.412 - 16535.237: 98.9341% ( 3) 00:07:37.240 16535.237 - 
16636.062: 98.9826% ( 8) 00:07:37.240 16636.062 - 16736.886: 99.0976% ( 19) 00:07:37.240 16736.886 - 16837.711: 99.1945% ( 16) 00:07:37.240 16837.711 - 16938.535: 99.3096% ( 19) 00:07:37.240 16938.535 - 17039.360: 99.3641% ( 9) 00:07:37.240 17039.360 - 17140.185: 99.4065% ( 7) 00:07:37.240 17140.185 - 17241.009: 99.4489% ( 7) 00:07:37.240 17241.009 - 17341.834: 99.4852% ( 6) 00:07:37.240 17341.834 - 17442.658: 99.5337% ( 8) 00:07:37.240 17442.658 - 17543.483: 99.5761% ( 7) 00:07:37.240 17543.483 - 17644.308: 99.6063% ( 5) 00:07:37.240 17644.308 - 17745.132: 99.6124% ( 1) 00:07:37.240 21475.643 - 21576.468: 99.6185% ( 1) 00:07:37.241 21576.468 - 21677.292: 99.6427% ( 4) 00:07:37.241 21677.292 - 21778.117: 99.6609% ( 3) 00:07:37.241 21778.117 - 21878.942: 99.6790% ( 3) 00:07:37.241 21878.942 - 21979.766: 99.7032% ( 4) 00:07:37.241 21979.766 - 22080.591: 99.7517% ( 8) 00:07:37.241 22080.591 - 22181.415: 99.7759% ( 4) 00:07:37.241 22181.415 - 22282.240: 99.7820% ( 1) 00:07:37.241 22282.240 - 22383.065: 99.8001% ( 3) 00:07:37.241 22383.065 - 22483.889: 99.8304% ( 5) 00:07:37.241 22483.889 - 22584.714: 99.8607% ( 5) 00:07:37.241 22584.714 - 22685.538: 99.8728% ( 2) 00:07:37.241 22685.538 - 22786.363: 99.8970% ( 4) 00:07:37.241 22786.363 - 22887.188: 99.9213% ( 4) 00:07:37.241 22887.188 - 22988.012: 99.9394% ( 3) 00:07:37.241 22988.012 - 23088.837: 99.9576% ( 3) 00:07:37.241 23088.837 - 23189.662: 99.9818% ( 4) 00:07:37.241 23189.662 - 23290.486: 100.0000% ( 3) 00:07:37.241 00:07:37.241 Latency histogram for PCIE (0000:00:12.0) NSID 2 from core 0: 00:07:37.241 ============================================================================== 00:07:37.241 Range in us Cumulative IO count 00:07:37.241 4360.665 - 4385.871: 0.0242% ( 4) 00:07:37.241 4385.871 - 4411.077: 0.0606% ( 6) 00:07:37.241 4411.077 - 4436.283: 0.1817% ( 20) 00:07:37.241 4436.283 - 4461.489: 0.2483% ( 11) 00:07:37.241 4461.489 - 4486.695: 0.2907% ( 7) 00:07:37.241 4486.695 - 4511.902: 0.3028% ( 2) 00:07:37.241 4511.902 - 4537.108: 0.3149% ( 2) 00:07:37.241 4537.108 - 4562.314: 0.3270% ( 2) 00:07:37.241 4562.314 - 4587.520: 0.3391% ( 2) 00:07:37.241 4612.726 - 4637.932: 0.3513% ( 2) 00:07:37.241 4637.932 - 4663.138: 0.3573% ( 1) 00:07:37.241 4663.138 - 4688.345: 0.3755% ( 3) 00:07:37.241 4688.345 - 4713.551: 0.3876% ( 2) 00:07:37.241 5898.240 - 5923.446: 0.3937% ( 1) 00:07:37.241 5923.446 - 5948.652: 0.4058% ( 2) 00:07:37.241 5948.652 - 5973.858: 0.4239% ( 3) 00:07:37.241 5973.858 - 5999.065: 0.4542% ( 5) 00:07:37.241 5999.065 - 6024.271: 0.4845% ( 5) 00:07:37.241 6024.271 - 6049.477: 0.5148% ( 5) 00:07:37.241 6049.477 - 6074.683: 0.5451% ( 5) 00:07:37.241 6074.683 - 6099.889: 0.6117% ( 11) 00:07:37.241 6099.889 - 6125.095: 0.6480% ( 6) 00:07:37.241 6125.095 - 6150.302: 0.6662% ( 3) 00:07:37.241 6150.302 - 6175.508: 0.6965% ( 5) 00:07:37.241 6175.508 - 6200.714: 0.7146% ( 3) 00:07:37.241 6200.714 - 6225.920: 0.7207% ( 1) 00:07:37.241 6225.920 - 6251.126: 0.7328% ( 2) 00:07:37.241 6251.126 - 6276.332: 0.7389% ( 1) 00:07:37.241 6276.332 - 6301.538: 0.7510% ( 2) 00:07:37.241 6301.538 - 6326.745: 0.7691% ( 3) 00:07:37.241 6326.745 - 6351.951: 0.7812% ( 2) 00:07:37.241 6351.951 - 6377.157: 0.7873% ( 1) 00:07:37.241 6377.157 - 6402.363: 0.7934% ( 1) 00:07:37.241 6402.363 - 6427.569: 0.8055% ( 2) 00:07:37.241 6427.569 - 6452.775: 0.8358% ( 5) 00:07:37.241 6452.775 - 6503.188: 0.8842% ( 8) 00:07:37.241 6503.188 - 6553.600: 1.0053% ( 20) 00:07:37.241 6553.600 - 6604.012: 1.3021% ( 49) 00:07:37.241 6604.012 - 6654.425: 1.7684% ( 77) 
00:07:37.241 6654.425 - 6704.837: 2.6163% ( 140) 00:07:37.241 6704.837 - 6755.249: 3.6458% ( 170) 00:07:37.241 6755.249 - 6805.662: 5.1659% ( 251) 00:07:37.241 6805.662 - 6856.074: 8.0547% ( 477) 00:07:37.241 6856.074 - 6906.486: 11.2403% ( 526) 00:07:37.241 6906.486 - 6956.898: 15.3101% ( 672) 00:07:37.241 6956.898 - 7007.311: 21.8508% ( 1080) 00:07:37.241 7007.311 - 7057.723: 29.9903% ( 1344) 00:07:37.241 7057.723 - 7108.135: 37.0700% ( 1169) 00:07:37.241 7108.135 - 7158.548: 44.7735% ( 1272) 00:07:37.241 7158.548 - 7208.960: 51.4717% ( 1106) 00:07:37.241 7208.960 - 7259.372: 57.9821% ( 1075) 00:07:37.241 7259.372 - 7309.785: 63.5235% ( 915) 00:07:37.241 7309.785 - 7360.197: 67.1391% ( 597) 00:07:37.241 7360.197 - 7410.609: 70.4336% ( 544) 00:07:37.241 7410.609 - 7461.022: 72.7531% ( 383) 00:07:37.241 7461.022 - 7511.434: 74.9152% ( 357) 00:07:37.241 7511.434 - 7561.846: 76.6655% ( 289) 00:07:37.241 7561.846 - 7612.258: 77.7435% ( 178) 00:07:37.241 7612.258 - 7662.671: 78.7185% ( 161) 00:07:37.241 7662.671 - 7713.083: 79.6996% ( 162) 00:07:37.241 7713.083 - 7763.495: 80.5354% ( 138) 00:07:37.241 7763.495 - 7813.908: 81.1834% ( 107) 00:07:37.241 7813.908 - 7864.320: 81.9767% ( 131) 00:07:37.241 7864.320 - 7914.732: 82.8125% ( 138) 00:07:37.241 7914.732 - 7965.145: 83.3212% ( 84) 00:07:37.241 7965.145 - 8015.557: 83.8542% ( 88) 00:07:37.241 8015.557 - 8065.969: 84.2418% ( 64) 00:07:37.241 8065.969 - 8116.382: 84.5930% ( 58) 00:07:37.241 8116.382 - 8166.794: 84.9806% ( 64) 00:07:37.241 8166.794 - 8217.206: 85.4954% ( 85) 00:07:37.241 8217.206 - 8267.618: 85.7740% ( 46) 00:07:37.241 8267.618 - 8318.031: 86.0162% ( 40) 00:07:37.241 8318.031 - 8368.443: 86.4947% ( 79) 00:07:37.241 8368.443 - 8418.855: 86.7733% ( 46) 00:07:37.241 8418.855 - 8469.268: 86.9671% ( 32) 00:07:37.241 8469.268 - 8519.680: 87.1487% ( 30) 00:07:37.241 8519.680 - 8570.092: 87.3789% ( 38) 00:07:37.241 8570.092 - 8620.505: 87.7301% ( 58) 00:07:37.241 8620.505 - 8670.917: 88.1117% ( 63) 00:07:37.241 8670.917 - 8721.329: 88.2873% ( 29) 00:07:37.241 8721.329 - 8771.742: 88.4327% ( 24) 00:07:37.241 8771.742 - 8822.154: 88.5901% ( 26) 00:07:37.241 8822.154 - 8872.566: 88.7657% ( 29) 00:07:37.241 8872.566 - 8922.978: 89.2078% ( 73) 00:07:37.241 8922.978 - 8973.391: 89.4198% ( 35) 00:07:37.241 8973.391 - 9023.803: 89.7529% ( 55) 00:07:37.241 9023.803 - 9074.215: 90.0073% ( 42) 00:07:37.241 9074.215 - 9124.628: 90.1223% ( 19) 00:07:37.241 9124.628 - 9175.040: 90.2192% ( 16) 00:07:37.241 9175.040 - 9225.452: 90.3101% ( 15) 00:07:37.241 9225.452 - 9275.865: 90.4191% ( 18) 00:07:37.241 9275.865 - 9326.277: 90.5099% ( 15) 00:07:37.241 9326.277 - 9376.689: 90.5947% ( 14) 00:07:37.241 9376.689 - 9427.102: 90.7037% ( 18) 00:07:37.241 9427.102 - 9477.514: 90.8430% ( 23) 00:07:37.241 9477.514 - 9527.926: 91.0671% ( 37) 00:07:37.241 9527.926 - 9578.338: 91.2730% ( 34) 00:07:37.241 9578.338 - 9628.751: 91.3820% ( 18) 00:07:37.241 9628.751 - 9679.163: 91.4729% ( 15) 00:07:37.241 9679.163 - 9729.575: 91.6061% ( 22) 00:07:37.241 9729.575 - 9779.988: 91.8060% ( 33) 00:07:37.241 9779.988 - 9830.400: 91.9392% ( 22) 00:07:37.241 9830.400 - 9880.812: 92.1088% ( 28) 00:07:37.241 9880.812 - 9931.225: 92.3328% ( 37) 00:07:37.241 9931.225 - 9981.637: 92.4479% ( 19) 00:07:37.241 9981.637 - 10032.049: 92.5933% ( 24) 00:07:37.241 10032.049 - 10082.462: 92.7083% ( 19) 00:07:37.241 10082.462 - 10132.874: 92.8416% ( 22) 00:07:37.241 10132.874 - 10183.286: 93.1565% ( 52) 00:07:37.241 10183.286 - 10233.698: 93.2837% ( 21) 00:07:37.241 10233.698 - 10284.111: 
93.4472% ( 27) 00:07:37.241 10284.111 - 10334.523: 93.6592% ( 35) 00:07:37.241 10334.523 - 10384.935: 93.8408% ( 30) 00:07:37.241 10384.935 - 10435.348: 94.0104% ( 28) 00:07:37.241 10435.348 - 10485.760: 94.1315% ( 20) 00:07:37.241 10485.760 - 10536.172: 94.2103% ( 13) 00:07:37.241 10536.172 - 10586.585: 94.2708% ( 10) 00:07:37.241 10586.585 - 10636.997: 94.3253% ( 9) 00:07:37.241 10636.997 - 10687.409: 94.3677% ( 7) 00:07:37.241 10687.409 - 10737.822: 94.4101% ( 7) 00:07:37.241 10737.822 - 10788.234: 94.4767% ( 11) 00:07:37.241 10788.234 - 10838.646: 94.5615% ( 14) 00:07:37.241 10838.646 - 10889.058: 94.6403% ( 13) 00:07:37.241 10889.058 - 10939.471: 94.8219% ( 30) 00:07:37.241 10939.471 - 10989.883: 94.9067% ( 14) 00:07:37.241 10989.883 - 11040.295: 94.9612% ( 9) 00:07:37.241 11040.295 - 11090.708: 95.0036% ( 7) 00:07:37.241 11090.708 - 11141.120: 95.0581% ( 9) 00:07:37.241 11141.120 - 11191.532: 95.1126% ( 9) 00:07:37.241 11191.532 - 11241.945: 95.1672% ( 9) 00:07:37.241 11241.945 - 11292.357: 95.2217% ( 9) 00:07:37.241 11292.357 - 11342.769: 95.2519% ( 5) 00:07:37.241 11342.769 - 11393.182: 95.2822% ( 5) 00:07:37.241 11393.182 - 11443.594: 95.3307% ( 8) 00:07:37.241 11443.594 - 11494.006: 95.3791% ( 8) 00:07:37.241 11494.006 - 11544.418: 95.4336% ( 9) 00:07:37.241 11544.418 - 11594.831: 95.4457% ( 2) 00:07:37.241 11594.831 - 11645.243: 95.5124% ( 11) 00:07:37.241 11645.243 - 11695.655: 95.5669% ( 9) 00:07:37.241 11695.655 - 11746.068: 95.5790% ( 2) 00:07:37.241 11746.068 - 11796.480: 95.5911% ( 2) 00:07:37.241 11796.480 - 11846.892: 95.6032% ( 2) 00:07:37.241 11846.892 - 11897.305: 95.6093% ( 1) 00:07:37.241 11897.305 - 11947.717: 95.6395% ( 5) 00:07:37.241 11947.717 - 11998.129: 95.7485% ( 18) 00:07:37.241 11998.129 - 12048.542: 95.8697% ( 20) 00:07:37.241 12048.542 - 12098.954: 96.0271% ( 26) 00:07:37.241 12098.954 - 12149.366: 96.1240% ( 16) 00:07:37.241 12149.366 - 12199.778: 96.2088% ( 14) 00:07:37.241 12199.778 - 12250.191: 96.2875% ( 13) 00:07:37.241 12250.191 - 12300.603: 96.3844% ( 16) 00:07:37.241 12300.603 - 12351.015: 96.4390% ( 9) 00:07:37.241 12351.015 - 12401.428: 96.4874% ( 8) 00:07:37.241 12401.428 - 12451.840: 96.5237% ( 6) 00:07:37.241 12451.840 - 12502.252: 96.5601% ( 6) 00:07:37.241 12502.252 - 12552.665: 96.5964% ( 6) 00:07:37.241 12552.665 - 12603.077: 96.6449% ( 8) 00:07:37.241 12603.077 - 12653.489: 96.6812% ( 6) 00:07:37.241 12653.489 - 12703.902: 96.7236% ( 7) 00:07:37.241 12703.902 - 12754.314: 96.7599% ( 6) 00:07:37.241 12754.314 - 12804.726: 96.8023% ( 7) 00:07:37.241 12804.726 - 12855.138: 96.8387% ( 6) 00:07:37.241 12855.138 - 12905.551: 96.8568% ( 3) 00:07:37.241 12905.551 - 13006.375: 96.8932% ( 6) 00:07:37.241 13006.375 - 13107.200: 96.9053% ( 2) 00:07:37.241 13107.200 - 13208.025: 97.0385% ( 22) 00:07:37.241 13208.025 - 13308.849: 97.0991% ( 10) 00:07:37.241 13308.849 - 13409.674: 97.1778% ( 13) 00:07:37.241 13409.674 - 13510.498: 97.3110% ( 22) 00:07:37.241 13510.498 - 13611.323: 97.4261% ( 19) 00:07:37.241 13611.323 - 13712.148: 97.5654% ( 23) 00:07:37.241 13712.148 - 13812.972: 97.7047% ( 23) 00:07:37.241 13812.972 - 13913.797: 97.8561% ( 25) 00:07:37.241 13913.797 - 14014.622: 98.1529% ( 49) 00:07:37.242 14014.622 - 14115.446: 98.3345% ( 30) 00:07:37.242 14115.446 - 14216.271: 98.3769% ( 7) 00:07:37.242 14216.271 - 14317.095: 98.3951% ( 3) 00:07:37.242 14317.095 - 14417.920: 98.4193% ( 4) 00:07:37.242 14417.920 - 14518.745: 98.4375% ( 3) 00:07:37.242 14518.745 - 14619.569: 98.4496% ( 2) 00:07:37.242 15325.342 - 15426.166: 98.4557% ( 1) 
00:07:37.242 15627.815 - 15728.640: 98.4738% ( 3) 00:07:37.242 15728.640 - 15829.465: 98.4981% ( 4) 00:07:37.242 15829.465 - 15930.289: 98.5223% ( 4) 00:07:37.242 15930.289 - 16031.114: 98.5526% ( 5) 00:07:37.242 16031.114 - 16131.938: 98.5950% ( 7) 00:07:37.242 16131.938 - 16232.763: 98.6858% ( 15) 00:07:37.242 16232.763 - 16333.588: 98.7766% ( 15) 00:07:37.242 16333.588 - 16434.412: 98.8917% ( 19) 00:07:37.242 16434.412 - 16535.237: 99.1097% ( 36) 00:07:37.242 16535.237 - 16636.062: 99.3399% ( 38) 00:07:37.242 16636.062 - 16736.886: 99.4065% ( 11) 00:07:37.242 16736.886 - 16837.711: 99.4610% ( 9) 00:07:37.242 16837.711 - 16938.535: 99.4973% ( 6) 00:07:37.242 16938.535 - 17039.360: 99.5276% ( 5) 00:07:37.242 17039.360 - 17140.185: 99.5579% ( 5) 00:07:37.242 17140.185 - 17241.009: 99.5942% ( 6) 00:07:37.242 17241.009 - 17341.834: 99.6124% ( 3) 00:07:37.242 21273.994 - 21374.818: 99.6366% ( 4) 00:07:37.242 21374.818 - 21475.643: 99.6548% ( 3) 00:07:37.242 21475.643 - 21576.468: 99.6911% ( 6) 00:07:37.242 21576.468 - 21677.292: 99.8062% ( 19) 00:07:37.242 21677.292 - 21778.117: 99.8486% ( 7) 00:07:37.242 21878.942 - 21979.766: 99.8668% ( 3) 00:07:37.242 21979.766 - 22080.591: 99.8910% ( 4) 00:07:37.242 22080.591 - 22181.415: 99.9152% ( 4) 00:07:37.242 22181.415 - 22282.240: 99.9334% ( 3) 00:07:37.242 22282.240 - 22383.065: 99.9576% ( 4) 00:07:37.242 22383.065 - 22483.889: 99.9758% ( 3) 00:07:37.242 22483.889 - 22584.714: 100.0000% ( 4) 00:07:37.242 00:07:37.242 Latency histogram for PCIE (0000:00:12.0) NSID 3 from core 0: 00:07:37.242 ============================================================================== 00:07:37.242 Range in us Cumulative IO count 00:07:37.242 4108.603 - 4133.809: 0.0061% ( 1) 00:07:37.242 4184.222 - 4209.428: 0.0182% ( 2) 00:07:37.242 4209.428 - 4234.634: 0.0908% ( 12) 00:07:37.242 4234.634 - 4259.840: 0.1514% ( 10) 00:07:37.242 4259.840 - 4285.046: 0.2483% ( 16) 00:07:37.242 4285.046 - 4310.252: 0.2907% ( 7) 00:07:37.242 4310.252 - 4335.458: 0.3089% ( 3) 00:07:37.242 4335.458 - 4360.665: 0.3210% ( 2) 00:07:37.242 4360.665 - 4385.871: 0.3331% ( 2) 00:07:37.242 4385.871 - 4411.077: 0.3452% ( 2) 00:07:37.242 4411.077 - 4436.283: 0.3513% ( 1) 00:07:37.242 4436.283 - 4461.489: 0.3634% ( 2) 00:07:37.242 4461.489 - 4486.695: 0.3755% ( 2) 00:07:37.242 4486.695 - 4511.902: 0.3876% ( 2) 00:07:37.242 5620.972 - 5646.178: 0.3937% ( 1) 00:07:37.242 5772.209 - 5797.415: 0.4239% ( 5) 00:07:37.242 5797.415 - 5822.622: 0.4542% ( 5) 00:07:37.242 5822.622 - 5847.828: 0.5087% ( 9) 00:07:37.242 5847.828 - 5873.034: 0.5814% ( 12) 00:07:37.242 5873.034 - 5898.240: 0.6541% ( 12) 00:07:37.242 5898.240 - 5923.446: 0.6662% ( 2) 00:07:37.242 5923.446 - 5948.652: 0.6783% ( 2) 00:07:37.242 5948.652 - 5973.858: 0.6844% ( 1) 00:07:37.242 5973.858 - 5999.065: 0.6965% ( 2) 00:07:37.242 5999.065 - 6024.271: 0.7086% ( 2) 00:07:37.242 6024.271 - 6049.477: 0.7207% ( 2) 00:07:37.242 6049.477 - 6074.683: 0.7328% ( 2) 00:07:37.242 6074.683 - 6099.889: 0.7389% ( 1) 00:07:37.242 6099.889 - 6125.095: 0.7510% ( 2) 00:07:37.242 6125.095 - 6150.302: 0.7691% ( 3) 00:07:37.242 6150.302 - 6175.508: 0.7752% ( 1) 00:07:37.242 6276.332 - 6301.538: 0.7812% ( 1) 00:07:37.242 6301.538 - 6326.745: 0.7994% ( 3) 00:07:37.242 6326.745 - 6351.951: 0.8115% ( 2) 00:07:37.242 6351.951 - 6377.157: 0.8297% ( 3) 00:07:37.242 6402.363 - 6427.569: 0.8418% ( 2) 00:07:37.242 6427.569 - 6452.775: 0.8539% ( 2) 00:07:37.242 6452.775 - 6503.188: 0.9387% ( 14) 00:07:37.242 6503.188 - 6553.600: 1.0538% ( 19) 00:07:37.242 6553.600 - 
6604.012: 1.2718% ( 36) 00:07:37.242 6604.012 - 6654.425: 1.9138% ( 106) 00:07:37.242 6654.425 - 6704.837: 2.5012% ( 97) 00:07:37.242 6704.837 - 6755.249: 3.7185% ( 201) 00:07:37.242 6755.249 - 6805.662: 5.8745% ( 356) 00:07:37.242 6805.662 - 6856.074: 8.2183% ( 387) 00:07:37.242 6856.074 - 6906.486: 10.9738% ( 455) 00:07:37.242 6906.486 - 6956.898: 15.6734% ( 776) 00:07:37.242 6956.898 - 7007.311: 21.4995% ( 962) 00:07:37.242 7007.311 - 7057.723: 29.1485% ( 1263) 00:07:37.242 7057.723 - 7108.135: 37.0942% ( 1312) 00:07:37.242 7108.135 - 7158.548: 44.9128% ( 1291) 00:07:37.242 7158.548 - 7208.960: 51.7805% ( 1134) 00:07:37.242 7208.960 - 7259.372: 58.6785% ( 1139) 00:07:37.242 7259.372 - 7309.785: 63.5477% ( 804) 00:07:37.242 7309.785 - 7360.197: 66.8484% ( 545) 00:07:37.242 7360.197 - 7410.609: 70.2398% ( 560) 00:07:37.242 7410.609 - 7461.022: 72.8682% ( 434) 00:07:37.242 7461.022 - 7511.434: 74.8304% ( 324) 00:07:37.242 7511.434 - 7561.846: 76.3324% ( 248) 00:07:37.242 7561.846 - 7612.258: 77.6102% ( 211) 00:07:37.242 7612.258 - 7662.671: 78.6579% ( 173) 00:07:37.242 7662.671 - 7713.083: 79.8510% ( 197) 00:07:37.242 7713.083 - 7763.495: 80.6747% ( 136) 00:07:37.242 7763.495 - 7813.908: 81.4438% ( 127) 00:07:37.242 7813.908 - 7864.320: 82.2008% ( 125) 00:07:37.242 7864.320 - 7914.732: 82.7822% ( 96) 00:07:37.242 7914.732 - 7965.145: 83.2485% ( 77) 00:07:37.242 7965.145 - 8015.557: 83.8239% ( 95) 00:07:37.242 8015.557 - 8065.969: 84.1994% ( 62) 00:07:37.242 8065.969 - 8116.382: 84.7323% ( 88) 00:07:37.242 8116.382 - 8166.794: 84.9927% ( 43) 00:07:37.242 8166.794 - 8217.206: 85.2410% ( 41) 00:07:37.242 8217.206 - 8267.618: 85.7376% ( 82) 00:07:37.242 8267.618 - 8318.031: 86.2282% ( 81) 00:07:37.242 8318.031 - 8368.443: 86.5855% ( 59) 00:07:37.242 8368.443 - 8418.855: 86.9247% ( 56) 00:07:37.242 8418.855 - 8469.268: 87.0942% ( 28) 00:07:37.242 8469.268 - 8519.680: 87.2699% ( 29) 00:07:37.242 8519.680 - 8570.092: 87.4092% ( 23) 00:07:37.242 8570.092 - 8620.505: 87.5848% ( 29) 00:07:37.242 8620.505 - 8670.917: 87.7725% ( 31) 00:07:37.242 8670.917 - 8721.329: 88.0572% ( 47) 00:07:37.242 8721.329 - 8771.742: 88.6204% ( 93) 00:07:37.242 8771.742 - 8822.154: 89.2381% ( 102) 00:07:37.242 8822.154 - 8872.566: 89.6681% ( 71) 00:07:37.242 8872.566 - 8922.978: 90.0375% ( 61) 00:07:37.242 8922.978 - 8973.391: 90.2495% ( 35) 00:07:37.242 8973.391 - 9023.803: 90.4615% ( 35) 00:07:37.242 9023.803 - 9074.215: 90.6674% ( 34) 00:07:37.242 9074.215 - 9124.628: 90.8975% ( 38) 00:07:37.242 9124.628 - 9175.040: 91.0308% ( 22) 00:07:37.242 9175.040 - 9225.452: 91.1640% ( 22) 00:07:37.242 9225.452 - 9275.865: 91.2912% ( 21) 00:07:37.242 9275.865 - 9326.277: 91.3881% ( 16) 00:07:37.242 9326.277 - 9376.689: 91.4729% ( 14) 00:07:37.242 9376.689 - 9427.102: 91.5334% ( 10) 00:07:37.242 9427.102 - 9477.514: 91.6122% ( 13) 00:07:37.242 9477.514 - 9527.926: 91.8605% ( 41) 00:07:37.242 9527.926 - 9578.338: 91.9210% ( 10) 00:07:37.242 9578.338 - 9628.751: 91.9695% ( 8) 00:07:37.242 9628.751 - 9679.163: 92.0119% ( 7) 00:07:37.242 9679.163 - 9729.575: 92.0543% ( 7) 00:07:37.242 9729.575 - 9779.988: 92.1209% ( 11) 00:07:37.242 9779.988 - 9830.400: 92.2117% ( 15) 00:07:37.242 9830.400 - 9880.812: 92.2965% ( 14) 00:07:37.242 9880.812 - 9931.225: 92.3328% ( 6) 00:07:37.242 9931.225 - 9981.637: 92.3934% ( 10) 00:07:37.242 9981.637 - 10032.049: 92.5145% ( 20) 00:07:37.242 10032.049 - 10082.462: 92.6357% ( 20) 00:07:37.242 10082.462 - 10132.874: 92.7386% ( 17) 00:07:37.242 10132.874 - 10183.286: 92.7871% ( 8) 00:07:37.242 10183.286 - 
10233.698: 92.8719% ( 14) 00:07:37.242 10233.698 - 10284.111: 92.9445% ( 12) 00:07:37.242 10284.111 - 10334.523: 93.0233% ( 13) 00:07:37.242 10334.523 - 10384.935: 93.1747% ( 25) 00:07:37.242 10384.935 - 10435.348: 93.3200% ( 24) 00:07:37.242 10435.348 - 10485.760: 93.3987% ( 13) 00:07:37.242 10485.760 - 10536.172: 93.5199% ( 20) 00:07:37.242 10536.172 - 10586.585: 93.6410% ( 20) 00:07:37.242 10586.585 - 10636.997: 93.7803% ( 23) 00:07:37.242 10636.997 - 10687.409: 94.1860% ( 67) 00:07:37.242 10687.409 - 10737.822: 94.2890% ( 17) 00:07:37.242 10737.822 - 10788.234: 94.4101% ( 20) 00:07:37.242 10788.234 - 10838.646: 94.5312% ( 20) 00:07:37.242 10838.646 - 10889.058: 94.6403% ( 18) 00:07:37.242 10889.058 - 10939.471: 94.7069% ( 11) 00:07:37.242 10939.471 - 10989.883: 94.7553% ( 8) 00:07:37.242 10989.883 - 11040.295: 94.8038% ( 8) 00:07:37.242 11040.295 - 11090.708: 94.8401% ( 6) 00:07:37.242 11090.708 - 11141.120: 94.8704% ( 5) 00:07:37.242 11141.120 - 11191.532: 94.9067% ( 6) 00:07:37.242 11191.532 - 11241.945: 94.9370% ( 5) 00:07:37.242 11241.945 - 11292.357: 94.9673% ( 5) 00:07:37.242 11292.357 - 11342.769: 94.9976% ( 5) 00:07:37.242 11342.769 - 11393.182: 95.0339% ( 6) 00:07:37.242 11393.182 - 11443.594: 95.2641% ( 38) 00:07:37.242 11443.594 - 11494.006: 95.3004% ( 6) 00:07:37.242 11494.006 - 11544.418: 95.3307% ( 5) 00:07:37.242 11544.418 - 11594.831: 95.3791% ( 8) 00:07:37.242 11594.831 - 11645.243: 95.4336% ( 9) 00:07:37.242 11645.243 - 11695.655: 95.4821% ( 8) 00:07:37.242 11695.655 - 11746.068: 95.5124% ( 5) 00:07:37.242 11746.068 - 11796.480: 95.5426% ( 5) 00:07:37.242 11796.480 - 11846.892: 95.5729% ( 5) 00:07:37.242 11846.892 - 11897.305: 95.6032% ( 5) 00:07:37.242 11897.305 - 11947.717: 95.6819% ( 13) 00:07:37.242 11947.717 - 11998.129: 95.7485% ( 11) 00:07:37.242 11998.129 - 12048.542: 95.8515% ( 17) 00:07:37.242 12048.542 - 12098.954: 95.9423% ( 15) 00:07:37.242 12098.954 - 12149.366: 96.0392% ( 16) 00:07:37.242 12149.366 - 12199.778: 96.1180% ( 13) 00:07:37.242 12199.778 - 12250.191: 96.2149% ( 16) 00:07:37.243 12250.191 - 12300.603: 96.2936% ( 13) 00:07:37.243 12300.603 - 12351.015: 96.3844% ( 15) 00:07:37.243 12351.015 - 12401.428: 96.4692% ( 14) 00:07:37.243 12401.428 - 12451.840: 96.5419% ( 12) 00:07:37.243 12451.840 - 12502.252: 96.6025% ( 10) 00:07:37.243 12502.252 - 12552.665: 96.6691% ( 11) 00:07:37.243 12552.665 - 12603.077: 96.6933% ( 4) 00:07:37.243 12603.077 - 12653.489: 96.7781% ( 14) 00:07:37.243 12653.489 - 12703.902: 96.8205% ( 7) 00:07:37.243 12703.902 - 12754.314: 96.8750% ( 9) 00:07:37.243 12754.314 - 12804.726: 96.9416% ( 11) 00:07:37.243 12804.726 - 12855.138: 97.0082% ( 11) 00:07:37.243 12855.138 - 12905.551: 97.0627% ( 9) 00:07:37.243 12905.551 - 13006.375: 97.1839% ( 20) 00:07:37.243 13006.375 - 13107.200: 97.3353% ( 25) 00:07:37.243 13107.200 - 13208.025: 97.3898% ( 9) 00:07:37.243 13208.025 - 13308.849: 97.4685% ( 13) 00:07:37.243 13308.849 - 13409.674: 97.5412% ( 12) 00:07:37.243 13409.674 - 13510.498: 97.6078% ( 11) 00:07:37.243 13510.498 - 13611.323: 97.6986% ( 15) 00:07:37.243 13611.323 - 13712.148: 97.7895% ( 15) 00:07:37.243 13712.148 - 13812.972: 97.8682% ( 13) 00:07:37.243 13812.972 - 13913.797: 97.9106% ( 7) 00:07:37.243 13913.797 - 14014.622: 97.9712% ( 10) 00:07:37.243 14014.622 - 14115.446: 98.0378% ( 11) 00:07:37.243 14115.446 - 14216.271: 98.0923% ( 9) 00:07:37.243 14216.271 - 14317.095: 98.1468% ( 9) 00:07:37.243 14317.095 - 14417.920: 98.2437% ( 16) 00:07:37.243 14417.920 - 14518.745: 98.3891% ( 24) 00:07:37.243 14518.745 - 14619.569: 
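The perf tool reports each namespace's latency distribution as cumulative-percentage buckets (Range in us / Cumulative IO count). A quick way to pull approximate percentiles out of that text is a small parser along these lines; this is a minimal sketch assuming the exact bucket layout shown above, and the function name and percentile targets are illustrative:

```python
import re

# One bucket from the SPDK latency histogram text, e.g.
#   "8620.505 - 8670.917: 88.4084% ( 57)"
BUCKET = re.compile(
    r"(?P<lo>\d+\.\d+)\s*-\s*(?P<hi>\d+\.\d+):\s*"
    r"(?P<cum>\d+\.\d+)%\s*\(\s*(?P<count>\d+)\)"
)

def percentile_us(histogram_text, target):
    """Return the upper edge (in us) of the first bucket whose
    cumulative IO percentage reaches `target`; None if never reached."""
    for m in BUCKET.finditer(histogram_text):
        if float(m.group("cum")) >= target:
            return float(m.group("hi"))
    return None

# Two buckets copied from the log above:
sample = "7158.548 - 7208.960: 50.5753% ( 1153) 23592.960 - 23693.785: 100.0000% ( 3)"
print(percentile_us(sample, 50.0))  # 7208.960 -> approximate p50
print(percentile_us(sample, 99.0))  # 23693.785 -> approximate p99
```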
00:07:37.243 
00:07:37.243 04:57:57 nvme.nvme_perf -- nvme/nvme.sh@24 -- # '[' -b /dev/ram0 ']'
00:07:37.243 
00:07:37.243 real 0m2.442s
00:07:37.243 user 0m2.194s
00:07:37.243 sys 0m0.164s
00:07:37.243 04:57:57 nvme.nvme_perf -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:37.243 04:57:57 nvme.nvme_perf -- common/autotest_common.sh@10 -- # set +x
00:07:37.243 ************************************
00:07:37.243 END TEST nvme_perf
00:07:37.243 ************************************
00:07:37.243 04:57:57 nvme -- nvme/nvme.sh@87 -- # run_test nvme_hello_world /home/vagrant/spdk_repo/spdk/build/examples/hello_world -i 0
00:07:37.243 04:57:57 nvme -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']'
00:07:37.243 04:57:57 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:37.243 04:57:57 nvme -- common/autotest_common.sh@10 -- # set +x
00:07:37.243 ************************************
00:07:37.243 START TEST nvme_hello_world
00:07:37.243 ************************************
00:07:37.243 04:57:57 nvme.nvme_hello_world -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/hello_world -i 0
00:07:37.243 Initializing NVMe Controllers
00:07:37.243 Attached to 0000:00:10.0
00:07:37.243 Namespace ID: 1 size: 6GB
00:07:37.243 Attached to 0000:00:11.0
00:07:37.243 Namespace ID: 1 size: 5GB
00:07:37.243 Attached to 0000:00:13.0
00:07:37.243 Namespace ID: 1 size: 1GB
00:07:37.243 Attached to 0000:00:12.0
00:07:37.243 Namespace ID: 1 size: 4GB
00:07:37.243 Namespace ID: 2 size: 4GB
00:07:37.243 Namespace ID: 3 size: 4GB
00:07:37.243 Initialization complete.
00:07:37.243 INFO: using host memory buffer for IO
00:07:37.243 Hello world!
00:07:37.243 INFO: using host memory buffer for IO
00:07:37.243 Hello world!
00:07:37.243 INFO: using host memory buffer for IO
00:07:37.243 Hello world!
00:07:37.243 INFO: using host memory buffer for IO
00:07:37.243 Hello world!
00:07:37.243 INFO: using host memory buffer for IO
00:07:37.243 Hello world!
00:07:37.243 INFO: using host memory buffer for IO
00:07:37.243 Hello world!
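The six "Hello world!" lines above match the six namespaces listed during attach (one per namespace, each using a host memory buffer for the IO). For post-processing, a minimal sketch like this (assuming the "Attached to" / "Namespace ID" record format shown above; the helper name is illustrative) folds those records into a per-controller namespace map:

```python
import re

ATTACH = re.compile(r"Attached to (?P<bdf>\d{4}:\d{2}:\d{2}\.\d)")
NS = re.compile(r"Namespace ID: (?P<nsid>\d+) size: (?P<size>\d+)GB")

def namespace_map(log_lines):
    """Fold attach/namespace records into {bdf: {nsid: size_gb}}."""
    devices, current = {}, None
    for line in log_lines:
        if m := ATTACH.search(line):
            current = m.group("bdf")
            devices.setdefault(current, {})
        elif (m := NS.search(line)) and current:
            devices[current][int(m.group("nsid"))] = int(m.group("size"))
    return devices

# e.g. namespace_map(open("build.log")) ->
#   {'0000:00:10.0': {1: 6}, ..., '0000:00:12.0': {1: 4, 2: 4, 3: 4}}
```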
00:07:37.243 
00:07:37.243 real 0m0.199s
00:07:37.243 user 0m0.059s
00:07:37.243 sys 0m0.099s
00:07:37.243 04:57:57 nvme.nvme_hello_world -- common/autotest_common.sh@1130 -- # xtrace_disable
00:07:37.243 04:57:57 nvme.nvme_hello_world -- common/autotest_common.sh@10 -- # set +x
00:07:37.243 ************************************
00:07:37.243 END TEST nvme_hello_world
00:07:37.243 ************************************
00:07:37.243 04:57:57 nvme -- nvme/nvme.sh@88 -- # run_test nvme_sgl /home/vagrant/spdk_repo/spdk/test/nvme/sgl/sgl
00:07:37.243 04:57:57 nvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']'
00:07:37.243 04:57:57 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable
00:07:37.243 04:57:57 nvme -- common/autotest_common.sh@10 -- # set +x
00:07:37.243 ************************************
00:07:37.243 START TEST nvme_sgl
00:07:37.243 ************************************
00:07:37.243 04:57:57 nvme.nvme_sgl -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/sgl/sgl
00:07:37.502 0000:00:10.0: build_io_request_0 Invalid IO length parameter
00:07:37.502 0000:00:10.0: build_io_request_1 Invalid IO length parameter
00:07:37.502 0000:00:10.0: build_io_request_3 Invalid IO length parameter
00:07:37.502 0000:00:10.0: build_io_request_8 Invalid IO length parameter
00:07:37.502 0000:00:10.0: build_io_request_9 Invalid IO length parameter
00:07:37.502 0000:00:10.0: build_io_request_11 Invalid IO length parameter
00:07:37.502 0000:00:11.0: build_io_request_0 Invalid IO length parameter
00:07:37.502 0000:00:11.0: build_io_request_1 Invalid IO length parameter
00:07:37.502 0000:00:11.0: build_io_request_3 Invalid IO length parameter
00:07:37.502 0000:00:11.0: build_io_request_8 Invalid IO length parameter
00:07:37.502 0000:00:11.0: build_io_request_9 Invalid IO length parameter
00:07:37.502 0000:00:11.0: build_io_request_11 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_0 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_1 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_2 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_3 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_4 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_5 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_6 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_7 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_8 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_9 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_10 Invalid IO length parameter
00:07:37.502 0000:00:13.0: build_io_request_11 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_0 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_1 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_2 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_3 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_4 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_5 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_6 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_7 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_8 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_9 Invalid IO length parameter
00:07:37.502 0000:00:12.0: build_io_request_10 Invalid IO length parameter 00:07:37.502 0000:00:12.0: build_io_request_11 Invalid IO length parameter 00:07:37.502 NVMe Readv/Writev Request test 00:07:37.502 Attached to 0000:00:10.0 00:07:37.502 Attached to 0000:00:11.0 00:07:37.502 Attached to 0000:00:13.0 00:07:37.502 Attached to 0000:00:12.0 00:07:37.502 0000:00:10.0: build_io_request_2 test passed 00:07:37.502 0000:00:10.0: build_io_request_4 test passed 00:07:37.502 0000:00:10.0: build_io_request_5 test passed 00:07:37.502 0000:00:10.0: build_io_request_6 test passed 00:07:37.502 0000:00:10.0: build_io_request_7 test passed 00:07:37.502 0000:00:10.0: build_io_request_10 test passed 00:07:37.502 0000:00:11.0: build_io_request_2 test passed 00:07:37.502 0000:00:11.0: build_io_request_4 test passed 00:07:37.502 0000:00:11.0: build_io_request_5 test passed 00:07:37.502 0000:00:11.0: build_io_request_6 test passed 00:07:37.502 0000:00:11.0: build_io_request_7 test passed 00:07:37.502 0000:00:11.0: build_io_request_10 test passed 00:07:37.502 Cleaning up... 00:07:37.502 00:07:37.502 real 0m0.257s 00:07:37.502 user 0m0.131s 00:07:37.502 sys 0m0.086s 00:07:37.502 04:57:57 nvme.nvme_sgl -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:37.502 04:57:57 nvme.nvme_sgl -- common/autotest_common.sh@10 -- # set +x 00:07:37.502 ************************************ 00:07:37.502 END TEST nvme_sgl 00:07:37.502 ************************************ 00:07:37.502 04:57:57 nvme -- nvme/nvme.sh@89 -- # run_test nvme_e2edp /home/vagrant/spdk_repo/spdk/test/nvme/e2edp/nvme_dp 00:07:37.502 04:57:57 nvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:37.502 04:57:57 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:37.502 04:57:57 nvme -- common/autotest_common.sh@10 -- # set +x 00:07:37.502 ************************************ 00:07:37.502 START TEST nvme_e2edp 00:07:37.502 ************************************ 00:07:37.502 04:57:57 nvme.nvme_e2edp -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/e2edp/nvme_dp 00:07:37.760 NVMe Write/Read with End-to-End data protection test 00:07:37.760 Attached to 0000:00:10.0 00:07:37.760 Attached to 0000:00:11.0 00:07:37.760 Attached to 0000:00:13.0 00:07:37.760 Attached to 0000:00:12.0 00:07:37.760 Cleaning up... 
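
The sgl test that just finished drives vectored I/O through the driver's scatter-gather path: the caller hands the driver reset/next-SGE callbacks and the driver walks them to build the list, rejecting requests whose segment lengths do not add up ("Invalid IO length parameter" is the expected failure for the deliberately malformed cases, "test passed" for the well-formed ones). A hedged sketch of a two-segment vectored read is below; the segment count, sizes, and LBA 0 are assumptions.

#include "spdk/nvme.h"
#include <stdint.h>

/* Two fixed scatter-gather segments; sizes are illustrative assumptions. */
struct sgl_ctx {
        void     *seg[2];
        uint32_t  len[2];
        uint32_t  idx;
        int       done;
};

static void
reset_sgl(void *cb_arg, uint32_t offset)
{
        struct sgl_ctx *c = cb_arg;
        /* Driver restarts the walk; a full version would honor 'offset'. */
        c->idx = 0;
}

static int
next_sge(void *cb_arg, void **address, uint32_t *length)
{
        struct sgl_ctx *c = cb_arg;
        *address = c->seg[c->idx];
        *length  = c->len[c->idx];
        c->idx++;
        return 0;
}

static void
read_done(void *arg, const struct spdk_nvme_cpl *cpl)
{
        ((struct sgl_ctx *)arg)->done = 1;
}

/* Issue a vectored read of lba_count blocks starting at LBA 0. The same
 * cb_arg is handed to the completion callback and to both SGE callbacks. */
static int
sgl_read(struct spdk_nvme_ns *ns, struct spdk_nvme_qpair *qp,
         struct sgl_ctx *c, uint32_t lba_count)
{
        return spdk_nvme_ns_cmd_readv(ns, qp, 0, lba_count,
                                      read_done, c, 0,
                                      reset_sgl, next_sge);
}
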
00:07:37.760 00:07:37.760 real 0m0.189s 00:07:37.760 user 0m0.064s 00:07:37.760 sys 0m0.090s 00:07:37.760 04:57:57 nvme.nvme_e2edp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:37.760 04:57:57 nvme.nvme_e2edp -- common/autotest_common.sh@10 -- # set +x 00:07:37.760 ************************************ 00:07:37.760 END TEST nvme_e2edp 00:07:37.760 ************************************ 00:07:37.760 04:57:57 nvme -- nvme/nvme.sh@90 -- # run_test nvme_reserve /home/vagrant/spdk_repo/spdk/test/nvme/reserve/reserve 00:07:37.760 04:57:57 nvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:37.760 04:57:57 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:37.760 04:57:57 nvme -- common/autotest_common.sh@10 -- # set +x 00:07:37.760 ************************************ 00:07:37.760 START TEST nvme_reserve 00:07:37.760 ************************************ 00:07:37.760 04:57:57 nvme.nvme_reserve -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/reserve/reserve 00:07:38.019 ===================================================== 00:07:38.019 NVMe Controller at PCI bus 0, device 16, function 0 00:07:38.019 ===================================================== 00:07:38.019 Reservations: Not Supported 00:07:38.019 ===================================================== 00:07:38.019 NVMe Controller at PCI bus 0, device 17, function 0 00:07:38.019 ===================================================== 00:07:38.019 Reservations: Not Supported 00:07:38.019 ===================================================== 00:07:38.019 NVMe Controller at PCI bus 0, device 19, function 0 00:07:38.019 ===================================================== 00:07:38.019 Reservations: Not Supported 00:07:38.019 ===================================================== 00:07:38.019 NVMe Controller at PCI bus 0, device 18, function 0 00:07:38.019 ===================================================== 00:07:38.019 Reservations: Not Supported 00:07:38.019 Reservation test passed 00:07:38.019 00:07:38.019 real 0m0.181s 00:07:38.019 user 0m0.065s 00:07:38.019 sys 0m0.083s 00:07:38.019 04:57:57 nvme.nvme_reserve -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:38.019 04:57:57 nvme.nvme_reserve -- common/autotest_common.sh@10 -- # set +x 00:07:38.019 ************************************ 00:07:38.019 END TEST nvme_reserve 00:07:38.019 ************************************ 00:07:38.019 04:57:58 nvme -- nvme/nvme.sh@91 -- # run_test nvme_err_injection /home/vagrant/spdk_repo/spdk/test/nvme/err_injection/err_injection 00:07:38.019 04:57:58 nvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:38.019 04:57:58 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:38.019 04:57:58 nvme -- common/autotest_common.sh@10 -- # set +x 00:07:38.019 ************************************ 00:07:38.019 START TEST nvme_err_injection 00:07:38.019 ************************************ 00:07:38.019 04:57:58 nvme.nvme_err_injection -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/err_injection/err_injection 00:07:38.277 NVMe Error Injection test 00:07:38.277 Attached to 0000:00:10.0 00:07:38.277 Attached to 0000:00:11.0 00:07:38.277 Attached to 0000:00:13.0 00:07:38.277 Attached to 0000:00:12.0 00:07:38.277 0000:00:12.0: get features failed as expected 00:07:38.277 0000:00:10.0: get features failed as expected 00:07:38.277 0000:00:11.0: get features failed as expected 00:07:38.277 0000:00:13.0: get features failed as expected 00:07:38.277 
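
The "failed as expected" lines above, followed by the same commands succeeding, come from deliberately injected failures: the test arms an error on a command opcode, watches it fail, clears the injection, and sees the command succeed. A sketch of that arm/clear cycle using SPDK's error-injection hook follows; to my understanding a NULL qpair targets the admin queue, and the specific status pair chosen here is an assumption.

#include "spdk/nvme.h"

/* Arm a one-shot failure on the admin Get Features command, then clear it. */
static int
inject_get_features_error(struct spdk_nvme_ctrlr *ctrlr)
{
        int rc;

        rc = spdk_nvme_qpair_add_cmd_error_injection(
                ctrlr, NULL /* admin qpair */,
                SPDK_NVME_OPC_GET_FEATURES,
                true,  /* do_not_submit: complete with the injected status */
                0,     /* timeout_in_us */
                1,     /* err_count: fail exactly one command */
                SPDK_NVME_SCT_GENERIC,
                SPDK_NVME_SC_INVALID_FIELD); /* assumed status pair */
        if (rc != 0) {
                return rc;
        }

        /* ... issue Get Features here; it completes with the injected error,
         * which is the "failed as expected" case in the log above ... */

        spdk_nvme_qpair_remove_cmd_error_injection(ctrlr, NULL,
                                                   SPDK_NVME_OPC_GET_FEATURES);
        /* The next Get Features then succeeds, matching "successfully as
         * expected" below. */
        return 0;
}
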
0000:00:10.0: get features successfully as expected 00:07:38.277 0000:00:11.0: get features successfully as expected 00:07:38.277 0000:00:13.0: get features successfully as expected 00:07:38.277 0000:00:12.0: get features successfully as expected 00:07:38.277 0000:00:11.0: read failed as expected 00:07:38.277 0000:00:13.0: read failed as expected 00:07:38.277 0000:00:12.0: read failed as expected 00:07:38.277 0000:00:10.0: read failed as expected 00:07:38.277 0000:00:11.0: read successfully as expected 00:07:38.277 0000:00:13.0: read successfully as expected 00:07:38.277 0000:00:12.0: read successfully as expected 00:07:38.277 0000:00:10.0: read successfully as expected 00:07:38.277 Cleaning up... 00:07:38.277 00:07:38.277 real 0m0.220s 00:07:38.277 user 0m0.073s 00:07:38.277 sys 0m0.096s 00:07:38.277 04:57:58 nvme.nvme_err_injection -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:38.277 04:57:58 nvme.nvme_err_injection -- common/autotest_common.sh@10 -- # set +x 00:07:38.277 ************************************ 00:07:38.277 END TEST nvme_err_injection 00:07:38.277 ************************************ 00:07:38.277 04:57:58 nvme -- nvme/nvme.sh@92 -- # run_test nvme_overhead /home/vagrant/spdk_repo/spdk/test/nvme/overhead/overhead -o 4096 -t 1 -H -i 0 00:07:38.277 04:57:58 nvme -- common/autotest_common.sh@1105 -- # '[' 9 -le 1 ']' 00:07:38.277 04:57:58 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:38.277 04:57:58 nvme -- common/autotest_common.sh@10 -- # set +x 00:07:38.277 ************************************ 00:07:38.277 START TEST nvme_overhead 00:07:38.277 ************************************ 00:07:38.277 04:57:58 nvme.nvme_overhead -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/overhead/overhead -o 4096 -t 1 -H -i 0 00:07:39.652 Initializing NVMe Controllers 00:07:39.652 Attached to 0000:00:10.0 00:07:39.652 Attached to 0000:00:11.0 00:07:39.652 Attached to 0000:00:13.0 00:07:39.652 Attached to 0000:00:12.0 00:07:39.652 Initialization complete. Launching workers. 
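
The submit/complete histograms that follow measure per-IO software overhead, not device latency: time spent inside the submission call and inside the completion poll that runs the callback. The measurement idea is just a tick counter around each call; a minimal sketch with SPDK's TSC helpers is below. The single-block read at LBA 0 is a simplification (the actual run above uses 4096-byte I/Os via -o 4096), and the bucketing into a histogram is omitted.

#include "spdk/env.h"
#include "spdk/nvme.h"
#include <inttypes.h>
#include <stdio.h>

static void
io_done(void *arg, const struct spdk_nvme_cpl *cpl)
{
        *(int *)arg = 1;
}

/* Time one read's submission call and its completion poll, in ns. */
static void
measure_one_io(struct spdk_nvme_ns *ns, struct spdk_nvme_qpair *qp, void *buf)
{
        uint64_t hz = spdk_get_ticks_hz();
        uint64_t t0, t1;
        int done = 0;

        t0 = spdk_get_ticks();
        spdk_nvme_ns_cmd_read(ns, qp, buf, 0, 1, io_done, &done, 0);
        t1 = spdk_get_ticks();
        printf("submit: %" PRIu64 " ns\n", (t1 - t0) * 1000000000 / hz);

        while (!done) {
                t0 = spdk_get_ticks();
                spdk_nvme_qpair_process_completions(qp, 0);
                t1 = spdk_get_ticks();
        }
        /* The last poll interval includes the completion callback cost. */
        printf("complete: %" PRIu64 " ns\n", (t1 - t0) * 1000000000 / hz);
}
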
00:07:39.652 submit (in ns) avg, min, max = 12266.7, 10788.5, 198461.5 00:07:39.652 complete (in ns) avg, min, max = 8178.1, 7254.6, 140045.4 00:07:39.652 00:07:39.652 Submit histogram 00:07:39.652 ================ 00:07:39.652 Range in us Cumulative Count 00:07:39.652 10.782 - 10.831: 0.0186% ( 3) 00:07:39.652 10.831 - 10.880: 0.1055% ( 14) 00:07:39.652 10.880 - 10.929: 0.4343% ( 53) 00:07:39.652 10.929 - 10.978: 1.3339% ( 145) 00:07:39.652 10.978 - 11.028: 2.9346% ( 258) 00:07:39.652 11.028 - 11.077: 5.7699% ( 457) 00:07:39.652 11.077 - 11.126: 9.9826% ( 679) 00:07:39.652 11.126 - 11.175: 15.8208% ( 941) 00:07:39.652 11.175 - 11.225: 21.6776% ( 944) 00:07:39.652 11.225 - 11.274: 26.4921% ( 776) 00:07:39.652 11.274 - 11.323: 29.9293% ( 554) 00:07:39.652 11.323 - 11.372: 32.2559% ( 375) 00:07:39.652 11.372 - 11.422: 33.8255% ( 253) 00:07:39.652 11.422 - 11.471: 34.8616% ( 167) 00:07:39.652 11.471 - 11.520: 35.6496% ( 127) 00:07:39.652 11.520 - 11.569: 36.4934% ( 136) 00:07:39.652 11.569 - 11.618: 37.3123% ( 132) 00:07:39.652 11.618 - 11.668: 37.9762% ( 107) 00:07:39.652 11.668 - 11.717: 38.9192% ( 152) 00:07:39.652 11.717 - 11.766: 40.1539% ( 199) 00:07:39.652 11.766 - 11.815: 41.9097% ( 283) 00:07:39.652 11.815 - 11.865: 44.7574% ( 459) 00:07:39.652 11.865 - 11.914: 48.9453% ( 675) 00:07:39.652 11.914 - 11.963: 53.3503% ( 710) 00:07:39.652 11.963 - 12.012: 58.3261% ( 802) 00:07:39.652 12.012 - 12.062: 63.1654% ( 780) 00:07:39.652 12.062 - 12.111: 67.6076% ( 716) 00:07:39.652 12.111 - 12.160: 71.2309% ( 584) 00:07:39.652 12.160 - 12.209: 74.0539% ( 455) 00:07:39.652 12.209 - 12.258: 76.1509% ( 338) 00:07:39.652 12.258 - 12.308: 77.8757% ( 278) 00:07:39.652 12.308 - 12.357: 79.1103% ( 199) 00:07:39.652 12.357 - 12.406: 80.1712% ( 171) 00:07:39.652 12.406 - 12.455: 80.9902% ( 132) 00:07:39.652 12.455 - 12.505: 81.7037% ( 115) 00:07:39.652 12.505 - 12.554: 82.2869% ( 94) 00:07:39.652 12.554 - 12.603: 82.5847% ( 48) 00:07:39.652 12.603 - 12.702: 83.1307% ( 88) 00:07:39.652 12.702 - 12.800: 83.4223% ( 47) 00:07:39.652 12.800 - 12.898: 83.6456% ( 36) 00:07:39.652 12.898 - 12.997: 83.7263% ( 13) 00:07:39.652 12.997 - 13.095: 83.8628% ( 22) 00:07:39.652 13.095 - 13.194: 83.9372% ( 12) 00:07:39.652 13.194 - 13.292: 84.1730% ( 38) 00:07:39.652 13.292 - 13.391: 84.7624% ( 95) 00:07:39.652 13.391 - 13.489: 86.3134% ( 250) 00:07:39.652 13.489 - 13.588: 88.2554% ( 313) 00:07:39.652 13.588 - 13.686: 90.0050% ( 282) 00:07:39.652 13.686 - 13.785: 91.0535% ( 169) 00:07:39.652 13.785 - 13.883: 91.6181% ( 91) 00:07:39.652 13.883 - 13.982: 92.0648% ( 72) 00:07:39.653 13.982 - 14.080: 92.4122% ( 56) 00:07:39.653 14.080 - 14.178: 92.9085% ( 80) 00:07:39.653 14.178 - 14.277: 93.4855% ( 93) 00:07:39.653 14.277 - 14.375: 93.9260% ( 71) 00:07:39.653 14.375 - 14.474: 94.4038% ( 77) 00:07:39.653 14.474 - 14.572: 94.8443% ( 71) 00:07:39.653 14.572 - 14.671: 95.2662% ( 68) 00:07:39.653 14.671 - 14.769: 95.4957% ( 37) 00:07:39.653 14.769 - 14.868: 95.7563% ( 42) 00:07:39.653 14.868 - 14.966: 95.9362% ( 29) 00:07:39.653 14.966 - 15.065: 96.0417% ( 17) 00:07:39.653 15.065 - 15.163: 96.1782% ( 22) 00:07:39.653 15.163 - 15.262: 96.2650% ( 14) 00:07:39.653 15.262 - 15.360: 96.3271% ( 10) 00:07:39.653 15.360 - 15.458: 96.4326% ( 17) 00:07:39.653 15.458 - 15.557: 96.5194% ( 14) 00:07:39.653 15.557 - 15.655: 96.6311% ( 18) 00:07:39.653 15.655 - 15.754: 96.7552% ( 20) 00:07:39.653 15.754 - 15.852: 96.8234% ( 11) 00:07:39.653 15.852 - 15.951: 96.8855% ( 10) 00:07:39.653 15.951 - 16.049: 96.9537% ( 11) 00:07:39.653 16.049 - 16.148: 
97.0096% ( 9) 00:07:39.653 16.148 - 16.246: 97.0716% ( 10) 00:07:39.653 16.246 - 16.345: 97.1212% ( 8) 00:07:39.653 16.345 - 16.443: 97.1709% ( 8) 00:07:39.653 16.443 - 16.542: 97.2267% ( 9) 00:07:39.653 16.542 - 16.640: 97.2639% ( 6) 00:07:39.653 16.640 - 16.738: 97.2949% ( 5) 00:07:39.653 16.738 - 16.837: 97.3260% ( 5) 00:07:39.653 16.837 - 16.935: 97.3694% ( 7) 00:07:39.653 16.935 - 17.034: 97.4128% ( 7) 00:07:39.653 17.034 - 17.132: 97.4376% ( 4) 00:07:39.653 17.132 - 17.231: 97.4997% ( 10) 00:07:39.653 17.231 - 17.329: 97.5307% ( 5) 00:07:39.653 17.329 - 17.428: 97.5865% ( 9) 00:07:39.653 17.428 - 17.526: 97.6362% ( 8) 00:07:39.653 17.526 - 17.625: 97.6982% ( 10) 00:07:39.653 17.625 - 17.723: 97.7417% ( 7) 00:07:39.653 17.723 - 17.822: 97.7851% ( 7) 00:07:39.653 17.822 - 17.920: 97.8161% ( 5) 00:07:39.653 17.920 - 18.018: 97.8719% ( 9) 00:07:39.653 18.018 - 18.117: 97.9216% ( 8) 00:07:39.653 18.117 - 18.215: 97.9898% ( 11) 00:07:39.653 18.215 - 18.314: 98.0457% ( 9) 00:07:39.653 18.314 - 18.412: 98.0953% ( 8) 00:07:39.653 18.412 - 18.511: 98.1325% ( 6) 00:07:39.653 18.511 - 18.609: 98.1573% ( 4) 00:07:39.653 18.609 - 18.708: 98.2442% ( 14) 00:07:39.653 18.708 - 18.806: 98.2690% ( 4) 00:07:39.653 18.806 - 18.905: 98.3435% ( 12) 00:07:39.653 18.905 - 19.003: 98.3559% ( 2) 00:07:39.653 19.003 - 19.102: 98.3869% ( 5) 00:07:39.653 19.102 - 19.200: 98.4179% ( 5) 00:07:39.653 19.200 - 19.298: 98.4303% ( 2) 00:07:39.653 19.298 - 19.397: 98.4365% ( 1) 00:07:39.653 19.397 - 19.495: 98.4427% ( 1) 00:07:39.653 19.495 - 19.594: 98.4489% ( 1) 00:07:39.653 19.594 - 19.692: 98.4613% ( 2) 00:07:39.653 19.692 - 19.791: 98.4676% ( 1) 00:07:39.653 19.791 - 19.889: 98.4800% ( 2) 00:07:39.653 19.889 - 19.988: 98.4862% ( 1) 00:07:39.653 19.988 - 20.086: 98.5234% ( 6) 00:07:39.653 20.086 - 20.185: 98.5978% ( 12) 00:07:39.653 20.185 - 20.283: 98.8212% ( 36) 00:07:39.653 20.283 - 20.382: 99.0259% ( 33) 00:07:39.653 20.382 - 20.480: 99.1376% ( 18) 00:07:39.653 20.480 - 20.578: 99.1810% ( 7) 00:07:39.653 20.578 - 20.677: 99.2431% ( 10) 00:07:39.653 20.677 - 20.775: 99.2803% ( 6) 00:07:39.653 20.775 - 20.874: 99.3610% ( 13) 00:07:39.653 20.874 - 20.972: 99.4292% ( 11) 00:07:39.653 20.972 - 21.071: 99.4602% ( 5) 00:07:39.653 21.071 - 21.169: 99.5099% ( 8) 00:07:39.653 21.169 - 21.268: 99.5285% ( 3) 00:07:39.653 21.268 - 21.366: 99.5347% ( 1) 00:07:39.653 21.366 - 21.465: 99.5533% ( 3) 00:07:39.653 21.465 - 21.563: 99.5719% ( 3) 00:07:39.653 21.563 - 21.662: 99.5905% ( 3) 00:07:39.653 21.760 - 21.858: 99.6091% ( 3) 00:07:39.653 21.858 - 21.957: 99.6215% ( 2) 00:07:39.653 22.055 - 22.154: 99.6277% ( 1) 00:07:39.653 22.252 - 22.351: 99.6339% ( 1) 00:07:39.653 22.351 - 22.449: 99.6402% ( 1) 00:07:39.653 22.449 - 22.548: 99.6588% ( 3) 00:07:39.653 22.646 - 22.745: 99.6650% ( 1) 00:07:39.653 22.942 - 23.040: 99.6836% ( 3) 00:07:39.653 23.532 - 23.631: 99.6898% ( 1) 00:07:39.653 23.828 - 23.926: 99.7022% ( 2) 00:07:39.653 24.123 - 24.222: 99.7084% ( 1) 00:07:39.653 24.222 - 24.320: 99.7146% ( 1) 00:07:39.653 25.108 - 25.206: 99.7208% ( 1) 00:07:39.653 25.206 - 25.403: 99.7270% ( 1) 00:07:39.653 25.403 - 25.600: 99.7332% ( 1) 00:07:39.653 25.600 - 25.797: 99.7394% ( 1) 00:07:39.653 26.191 - 26.388: 99.7518% ( 2) 00:07:39.653 26.978 - 27.175: 99.7580% ( 1) 00:07:39.653 29.932 - 30.129: 99.7642% ( 1) 00:07:39.653 30.917 - 31.114: 99.7704% ( 1) 00:07:39.653 31.114 - 31.311: 99.7766% ( 1) 00:07:39.653 31.311 - 31.508: 99.7891% ( 2) 00:07:39.653 31.508 - 31.705: 99.8139% ( 4) 00:07:39.653 31.705 - 31.902: 99.8325% ( 3) 
00:07:39.653 31.902 - 32.098: 99.8821% ( 8) 00:07:39.653 32.098 - 32.295: 99.9069% ( 4) 00:07:39.653 32.295 - 32.492: 99.9131% ( 1) 00:07:39.653 32.492 - 32.689: 99.9193% ( 1) 00:07:39.653 32.886 - 33.083: 99.9380% ( 3) 00:07:39.653 33.083 - 33.280: 99.9442% ( 1) 00:07:39.653 38.006 - 38.203: 99.9504% ( 1) 00:07:39.653 43.323 - 43.520: 99.9566% ( 1) 00:07:39.653 43.717 - 43.914: 99.9628% ( 1) 00:07:39.653 51.988 - 52.382: 99.9690% ( 1) 00:07:39.653 52.775 - 53.169: 99.9752% ( 1) 00:07:39.653 55.926 - 56.320: 99.9814% ( 1) 00:07:39.653 66.560 - 66.954: 99.9876% ( 1) 00:07:39.653 95.311 - 95.705: 99.9938% ( 1) 00:07:39.653 197.711 - 198.498: 100.0000% ( 1) 00:07:39.653 00:07:39.653 Complete histogram 00:07:39.653 ================== 00:07:39.653 Range in us Cumulative Count 00:07:39.653 7.237 - 7.286: 0.0310% ( 5) 00:07:39.653 7.286 - 7.335: 0.5274% ( 80) 00:07:39.653 7.335 - 7.385: 2.3328% ( 291) 00:07:39.653 7.385 - 7.434: 7.2590% ( 794) 00:07:39.653 7.434 - 7.483: 16.9066% ( 1555) 00:07:39.653 7.483 - 7.532: 28.5892% ( 1883) 00:07:39.653 7.532 - 7.582: 36.4313% ( 1264) 00:07:39.653 7.582 - 7.631: 40.7557% ( 697) 00:07:39.653 7.631 - 7.680: 42.9396% ( 352) 00:07:39.653 7.680 - 7.729: 43.7461% ( 130) 00:07:39.653 7.729 - 7.778: 44.4348% ( 111) 00:07:39.653 7.778 - 7.828: 46.9351% ( 403) 00:07:39.653 7.828 - 7.877: 53.3875% ( 1040) 00:07:39.653 7.877 - 7.926: 61.7322% ( 1345) 00:07:39.653 7.926 - 7.975: 68.5383% ( 1097) 00:07:39.653 7.975 - 8.025: 73.7064% ( 833) 00:07:39.653 8.025 - 8.074: 77.1870% ( 561) 00:07:39.653 8.074 - 8.123: 79.7556% ( 414) 00:07:39.653 8.123 - 8.172: 81.3687% ( 260) 00:07:39.653 8.172 - 8.222: 82.5164% ( 185) 00:07:39.653 8.222 - 8.271: 83.2610% ( 120) 00:07:39.653 8.271 - 8.320: 83.7387% ( 77) 00:07:39.653 8.320 - 8.369: 84.1109% ( 60) 00:07:39.653 8.369 - 8.418: 84.3095% ( 32) 00:07:39.653 8.418 - 8.468: 84.4087% ( 16) 00:07:39.653 8.468 - 8.517: 84.5390% ( 21) 00:07:39.653 8.517 - 8.566: 84.6569% ( 19) 00:07:39.653 8.566 - 8.615: 84.7872% ( 21) 00:07:39.653 8.615 - 8.665: 84.8927% ( 17) 00:07:39.653 8.665 - 8.714: 85.0478% ( 25) 00:07:39.653 8.714 - 8.763: 85.1284% ( 13) 00:07:39.653 8.763 - 8.812: 85.2339% ( 17) 00:07:39.653 8.812 - 8.862: 85.2773% ( 7) 00:07:39.653 8.862 - 8.911: 85.2835% ( 1) 00:07:39.653 8.911 - 8.960: 85.3208% ( 6) 00:07:39.653 8.960 - 9.009: 85.3456% ( 4) 00:07:39.653 9.009 - 9.058: 85.3766% ( 5) 00:07:39.653 9.058 - 9.108: 85.4759% ( 16) 00:07:39.653 9.108 - 9.157: 85.8791% ( 65) 00:07:39.653 9.157 - 9.206: 86.9090% ( 166) 00:07:39.653 9.206 - 9.255: 89.1922% ( 368) 00:07:39.653 9.255 - 9.305: 91.3017% ( 340) 00:07:39.653 9.305 - 9.354: 92.6480% ( 217) 00:07:39.653 9.354 - 9.403: 93.4793% ( 134) 00:07:39.653 9.403 - 9.452: 94.2363% ( 122) 00:07:39.653 9.452 - 9.502: 94.8070% ( 92) 00:07:39.653 9.502 - 9.551: 95.1793% ( 60) 00:07:39.653 9.551 - 9.600: 95.4399% ( 42) 00:07:39.653 9.600 - 9.649: 95.6198% ( 29) 00:07:39.653 9.649 - 9.698: 95.6881% ( 11) 00:07:39.653 9.698 - 9.748: 95.7749% ( 14) 00:07:39.653 9.748 - 9.797: 95.8494% ( 12) 00:07:39.653 9.797 - 9.846: 95.8928% ( 7) 00:07:39.653 9.846 - 9.895: 95.9362% ( 7) 00:07:39.653 9.895 - 9.945: 95.9983% ( 10) 00:07:39.653 9.945 - 9.994: 96.0355% ( 6) 00:07:39.653 9.994 - 10.043: 96.0851% ( 8) 00:07:39.653 10.043 - 10.092: 96.1037% ( 3) 00:07:39.653 10.092 - 10.142: 96.1348% ( 5) 00:07:39.653 10.142 - 10.191: 96.1534% ( 3) 00:07:39.653 10.191 - 10.240: 96.2278% ( 12) 00:07:39.653 10.240 - 10.289: 96.2899% ( 10) 00:07:39.653 10.289 - 10.338: 96.2961% ( 1) 00:07:39.653 10.338 - 10.388: 
96.3085% ( 2) 00:07:39.653 10.388 - 10.437: 96.3643% ( 9) 00:07:39.653 10.437 - 10.486: 96.4015% ( 6) 00:07:39.653 10.486 - 10.535: 96.4574% ( 9) 00:07:39.653 10.535 - 10.585: 96.5132% ( 9) 00:07:39.653 10.585 - 10.634: 96.5380% ( 4) 00:07:39.653 10.634 - 10.683: 96.5877% ( 8) 00:07:39.653 10.683 - 10.732: 96.6373% ( 8) 00:07:39.654 10.732 - 10.782: 96.6993% ( 10) 00:07:39.654 10.782 - 10.831: 96.7366% ( 6) 00:07:39.654 10.831 - 10.880: 96.8048% ( 11) 00:07:39.654 10.880 - 10.929: 96.8172% ( 2) 00:07:39.654 10.929 - 10.978: 96.8544% ( 6) 00:07:39.654 10.978 - 11.028: 96.8731% ( 3) 00:07:39.654 11.028 - 11.077: 96.9103% ( 6) 00:07:39.654 11.077 - 11.126: 96.9475% ( 6) 00:07:39.654 11.126 - 11.175: 96.9909% ( 7) 00:07:39.654 11.175 - 11.225: 97.0096% ( 3) 00:07:39.654 11.225 - 11.274: 97.0282% ( 3) 00:07:39.654 11.274 - 11.323: 97.0530% ( 4) 00:07:39.654 11.323 - 11.372: 97.0902% ( 6) 00:07:39.654 11.372 - 11.422: 97.1088% ( 3) 00:07:39.654 11.422 - 11.471: 97.1212% ( 2) 00:07:39.654 11.471 - 11.520: 97.1460% ( 4) 00:07:39.654 11.520 - 11.569: 97.1523% ( 1) 00:07:39.654 11.618 - 11.668: 97.1771% ( 4) 00:07:39.654 11.717 - 11.766: 97.1833% ( 1) 00:07:39.654 11.766 - 11.815: 97.1957% ( 2) 00:07:39.654 11.815 - 11.865: 97.2143% ( 3) 00:07:39.654 11.865 - 11.914: 97.2391% ( 4) 00:07:39.654 11.914 - 11.963: 97.2701% ( 5) 00:07:39.654 11.963 - 12.012: 97.2763% ( 1) 00:07:39.654 12.012 - 12.062: 97.2887% ( 2) 00:07:39.654 12.062 - 12.111: 97.3012% ( 2) 00:07:39.654 12.111 - 12.160: 97.3136% ( 2) 00:07:39.654 12.160 - 12.209: 97.3198% ( 1) 00:07:39.654 12.209 - 12.258: 97.3322% ( 2) 00:07:39.654 12.308 - 12.357: 97.3384% ( 1) 00:07:39.654 12.406 - 12.455: 97.3446% ( 1) 00:07:39.654 12.603 - 12.702: 97.3694% ( 4) 00:07:39.654 12.702 - 12.800: 97.3818% ( 2) 00:07:39.654 12.800 - 12.898: 97.3880% ( 1) 00:07:39.654 12.898 - 12.997: 97.4066% ( 3) 00:07:39.654 12.997 - 13.095: 97.4252% ( 3) 00:07:39.654 13.095 - 13.194: 97.4314% ( 1) 00:07:39.654 13.194 - 13.292: 97.4563% ( 4) 00:07:39.654 13.292 - 13.391: 97.4687% ( 2) 00:07:39.654 13.391 - 13.489: 97.4935% ( 4) 00:07:39.654 13.489 - 13.588: 97.5245% ( 5) 00:07:39.654 13.588 - 13.686: 97.5617% ( 6) 00:07:39.654 13.686 - 13.785: 97.6114% ( 8) 00:07:39.654 13.785 - 13.883: 97.6424% ( 5) 00:07:39.654 13.883 - 13.982: 97.7044% ( 10) 00:07:39.654 13.982 - 14.080: 97.7975% ( 15) 00:07:39.654 14.080 - 14.178: 98.1139% ( 51) 00:07:39.654 14.178 - 14.277: 98.3807% ( 43) 00:07:39.654 14.277 - 14.375: 98.6227% ( 39) 00:07:39.654 14.375 - 14.474: 98.7219% ( 16) 00:07:39.654 14.474 - 14.572: 98.8274% ( 17) 00:07:39.654 14.572 - 14.671: 98.9205% ( 15) 00:07:39.654 14.671 - 14.769: 98.9825% ( 10) 00:07:39.654 14.769 - 14.868: 99.0197% ( 6) 00:07:39.654 14.868 - 14.966: 99.0508% ( 5) 00:07:39.654 14.966 - 15.065: 99.1128% ( 10) 00:07:39.654 15.065 - 15.163: 99.1624% ( 8) 00:07:39.654 15.163 - 15.262: 99.1934% ( 5) 00:07:39.654 15.262 - 15.360: 99.2307% ( 6) 00:07:39.654 15.360 - 15.458: 99.2493% ( 3) 00:07:39.654 15.458 - 15.557: 99.2989% ( 8) 00:07:39.654 15.557 - 15.655: 99.3175% ( 3) 00:07:39.654 15.754 - 15.852: 99.3237% ( 1) 00:07:39.654 15.852 - 15.951: 99.3424% ( 3) 00:07:39.654 15.951 - 16.049: 99.3610% ( 3) 00:07:39.654 16.049 - 16.148: 99.3796% ( 3) 00:07:39.654 16.148 - 16.246: 99.4044% ( 4) 00:07:39.654 16.246 - 16.345: 99.4230% ( 3) 00:07:39.654 16.345 - 16.443: 99.4478% ( 4) 00:07:39.654 16.443 - 16.542: 99.4664% ( 3) 00:07:39.654 16.640 - 16.738: 99.4726% ( 1) 00:07:39.654 16.738 - 16.837: 99.4788% ( 1) 00:07:39.654 17.034 - 17.132: 99.5099% ( 5) 
00:07:39.654 17.132 - 17.231: 99.5347% ( 4) 00:07:39.654 17.231 - 17.329: 99.5471% ( 2) 00:07:39.654 17.329 - 17.428: 99.5595% ( 2) 00:07:39.654 17.526 - 17.625: 99.5781% ( 3) 00:07:39.654 17.625 - 17.723: 99.6029% ( 4) 00:07:39.654 17.723 - 17.822: 99.6215% ( 3) 00:07:39.654 17.822 - 17.920: 99.6277% ( 1) 00:07:39.654 18.215 - 18.314: 99.6339% ( 1) 00:07:39.654 18.412 - 18.511: 99.6464% ( 2) 00:07:39.654 18.511 - 18.609: 99.6526% ( 1) 00:07:39.654 18.708 - 18.806: 99.6650% ( 2) 00:07:39.654 18.806 - 18.905: 99.6712% ( 1) 00:07:39.654 18.905 - 19.003: 99.6774% ( 1) 00:07:39.654 19.003 - 19.102: 99.6836% ( 1) 00:07:39.654 19.200 - 19.298: 99.6898% ( 1) 00:07:39.654 19.692 - 19.791: 99.7022% ( 2) 00:07:39.654 19.791 - 19.889: 99.7208% ( 3) 00:07:39.654 20.775 - 20.874: 99.7270% ( 1) 00:07:39.654 22.055 - 22.154: 99.7332% ( 1) 00:07:39.654 22.154 - 22.252: 99.7580% ( 4) 00:07:39.654 22.252 - 22.351: 99.7891% ( 5) 00:07:39.654 22.351 - 22.449: 99.8077% ( 3) 00:07:39.654 22.449 - 22.548: 99.8263% ( 3) 00:07:39.654 22.548 - 22.646: 99.8387% ( 2) 00:07:39.654 22.646 - 22.745: 99.8573% ( 3) 00:07:39.654 22.745 - 22.843: 99.8697% ( 2) 00:07:39.654 22.843 - 22.942: 99.8759% ( 1) 00:07:39.654 23.138 - 23.237: 99.8821% ( 1) 00:07:39.654 23.335 - 23.434: 99.8945% ( 2) 00:07:39.654 23.434 - 23.532: 99.9007% ( 1) 00:07:39.654 23.532 - 23.631: 99.9069% ( 1) 00:07:39.654 23.729 - 23.828: 99.9131% ( 1) 00:07:39.654 24.025 - 24.123: 99.9193% ( 1) 00:07:39.654 24.418 - 24.517: 99.9255% ( 1) 00:07:39.654 24.517 - 24.615: 99.9318% ( 1) 00:07:39.654 25.206 - 25.403: 99.9380% ( 1) 00:07:39.654 27.963 - 28.160: 99.9442% ( 1) 00:07:39.654 28.751 - 28.948: 99.9504% ( 1) 00:07:39.654 29.932 - 30.129: 99.9566% ( 1) 00:07:39.654 31.311 - 31.508: 99.9628% ( 1) 00:07:39.654 40.369 - 40.566: 99.9690% ( 1) 00:07:39.654 40.960 - 41.157: 99.9752% ( 1) 00:07:39.654 45.489 - 45.686: 99.9814% ( 1) 00:07:39.654 49.428 - 49.625: 99.9876% ( 1) 00:07:39.654 79.163 - 79.557: 99.9938% ( 1) 00:07:39.654 139.422 - 140.209: 100.0000% ( 1) 00:07:39.654 00:07:39.654 00:07:39.654 real 0m1.191s 00:07:39.654 user 0m1.061s 00:07:39.654 sys 0m0.087s 00:07:39.654 04:57:59 nvme.nvme_overhead -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:39.654 ************************************ 00:07:39.654 END TEST nvme_overhead 00:07:39.654 ************************************ 00:07:39.654 04:57:59 nvme.nvme_overhead -- common/autotest_common.sh@10 -- # set +x 00:07:39.654 04:57:59 nvme -- nvme/nvme.sh@93 -- # run_test nvme_arbitration /home/vagrant/spdk_repo/spdk/build/examples/arbitration -t 3 -i 0 00:07:39.654 04:57:59 nvme -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:07:39.654 04:57:59 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:39.654 04:57:59 nvme -- common/autotest_common.sh@10 -- # set +x 00:07:39.654 ************************************ 00:07:39.654 START TEST nvme_arbitration 00:07:39.654 ************************************ 00:07:39.654 04:57:59 nvme.nvme_arbitration -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/arbitration -t 3 -i 0 00:07:42.975 Initializing NVMe Controllers 00:07:42.975 Attached to 0000:00:10.0 00:07:42.975 Attached to 0000:00:11.0 00:07:42.975 Attached to 0000:00:13.0 00:07:42.975 Attached to 0000:00:12.0 00:07:42.975 Associating QEMU NVMe Ctrl (12340 ) with lcore 0 00:07:42.975 Associating QEMU NVMe Ctrl (12341 ) with lcore 1 00:07:42.975 Associating QEMU NVMe Ctrl (12343 ) with lcore 2 00:07:42.975 Associating QEMU NVMe Ctrl (12342 ) with 
lcore 3 00:07:42.975 Associating QEMU NVMe Ctrl (12342 ) with lcore 0 00:07:42.975 Associating QEMU NVMe Ctrl (12342 ) with lcore 1 00:07:42.975 /home/vagrant/spdk_repo/spdk/build/examples/arbitration run with configuration: 00:07:42.975 /home/vagrant/spdk_repo/spdk/build/examples/arbitration -q 64 -s 131072 -w randrw -M 50 -l 0 -t 3 -c 0xf -m 0 -a 0 -b 0 -n 100000 -i 0 00:07:42.975 Initialization complete. Launching workers. 00:07:42.975 Starting thread on core 1 with urgent priority queue 00:07:42.975 Starting thread on core 2 with urgent priority queue 00:07:42.975 Starting thread on core 3 with urgent priority queue 00:07:42.975 Starting thread on core 0 with urgent priority queue 00:07:42.975 QEMU NVMe Ctrl (12340 ) core 0: 6475.67 IO/s 15.44 secs/100000 ios 00:07:42.975 QEMU NVMe Ctrl (12342 ) core 0: 6506.67 IO/s 15.37 secs/100000 ios 00:07:42.975 QEMU NVMe Ctrl (12341 ) core 1: 6381.00 IO/s 15.67 secs/100000 ios 00:07:42.975 QEMU NVMe Ctrl (12342 ) core 1: 6376.00 IO/s 15.68 secs/100000 ios 00:07:42.975 QEMU NVMe Ctrl (12343 ) core 2: 5696.00 IO/s 17.56 secs/100000 ios 00:07:42.975 QEMU NVMe Ctrl (12342 ) core 3: 6327.67 IO/s 15.80 secs/100000 ios 00:07:42.975 ======================================================== 00:07:42.975 00:07:42.975 00:07:42.975 real 0m3.230s 00:07:42.975 user 0m9.044s 00:07:42.975 sys 0m0.104s 00:07:42.975 04:58:02 nvme.nvme_arbitration -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:42.975 ************************************ 00:07:42.975 04:58:02 nvme.nvme_arbitration -- common/autotest_common.sh@10 -- # set +x 00:07:42.975 END TEST nvme_arbitration 00:07:42.975 ************************************ 00:07:42.975 04:58:02 nvme -- nvme/nvme.sh@94 -- # run_test nvme_single_aen /home/vagrant/spdk_repo/spdk/test/nvme/aer/aer -T -i 0 00:07:42.975 04:58:02 nvme -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:07:42.975 04:58:02 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:42.975 04:58:02 nvme -- common/autotest_common.sh@10 -- # set +x 00:07:42.975 ************************************ 00:07:42.975 START TEST nvme_single_aen 00:07:42.975 ************************************ 00:07:42.975 04:58:02 nvme.nvme_single_aen -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/aer/aer -T -i 0 00:07:42.975 Asynchronous Event Request test 00:07:42.975 Attached to 0000:00:10.0 00:07:42.975 Attached to 0000:00:11.0 00:07:42.975 Attached to 0000:00:13.0 00:07:42.975 Attached to 0000:00:12.0 00:07:42.975 Reset controller to setup AER completions for this process 00:07:42.975 Registering asynchronous event callbacks... 
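
The arbitration run above pins one weighted-round-robin qpair per core and tags each with a priority class ("Starting thread on core N with urgent priority queue"). The knob for this is the qprio field in the I/O qpair options; a short sketch follows, with the urgent class as an arbitrary example, and it only takes effect if the controller was attached with WRR arbitration enabled.

#include "spdk/nvme.h"

/* Allocate an I/O qpair in the urgent WRR priority class. Requires the
 * controller to be configured for weighted round robin arbitration at
 * attach time; otherwise the priority is ignored or rejected. */
static struct spdk_nvme_qpair *
alloc_urgent_qpair(struct spdk_nvme_ctrlr *ctrlr)
{
        struct spdk_nvme_io_qpair_opts opts;

        spdk_nvme_ctrlr_get_default_io_qpair_opts(ctrlr, &opts, sizeof(opts));
        opts.qprio = SPDK_NVME_QPRIO_URGENT;
        return spdk_nvme_ctrlr_alloc_io_qpair(ctrlr, &opts, sizeof(opts));
}
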
00:07:42.975 Getting orig temperature thresholds of all controllers 00:07:42.975 0000:00:10.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:07:42.975 0000:00:11.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:07:42.975 0000:00:13.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:07:42.975 0000:00:12.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:07:42.975 Setting all controllers temperature threshold low to trigger AER 00:07:42.975 Waiting for all controllers temperature threshold to be set lower 00:07:42.975 0000:00:10.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:07:42.975 aer_cb - Resetting Temp Threshold for device: 0000:00:10.0 00:07:42.975 0000:00:11.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:07:42.975 aer_cb - Resetting Temp Threshold for device: 0000:00:11.0 00:07:42.975 0000:00:13.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:07:42.975 aer_cb - Resetting Temp Threshold for device: 0000:00:13.0 00:07:42.975 0000:00:12.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:07:42.975 aer_cb - Resetting Temp Threshold for device: 0000:00:12.0 00:07:42.975 Waiting for all controllers to trigger AER and reset threshold 00:07:42.975 0000:00:10.0: Current Temperature: 323 Kelvin (50 Celsius) 00:07:42.975 0000:00:11.0: Current Temperature: 323 Kelvin (50 Celsius) 00:07:42.975 0000:00:13.0: Current Temperature: 323 Kelvin (50 Celsius) 00:07:42.975 0000:00:12.0: Current Temperature: 323 Kelvin (50 Celsius) 00:07:42.975 Cleaning up... 00:07:43.234 00:07:43.234 real 0m0.207s 00:07:43.234 user 0m0.064s 00:07:43.234 sys 0m0.098s 00:07:43.234 ************************************ 00:07:43.234 END TEST nvme_single_aen 00:07:43.234 ************************************ 00:07:43.234 04:58:02 nvme.nvme_single_aen -- common/autotest_common.sh@1130 -- # xtrace_disable 00:07:43.234 04:58:02 nvme.nvme_single_aen -- common/autotest_common.sh@10 -- # set +x 00:07:43.234 04:58:03 nvme -- nvme/nvme.sh@95 -- # run_test nvme_doorbell_aers nvme_doorbell_aers 00:07:43.234 04:58:03 nvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:07:43.234 04:58:03 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:07:43.234 04:58:03 nvme -- common/autotest_common.sh@10 -- # set +x 00:07:43.234 ************************************ 00:07:43.234 START TEST nvme_doorbell_aers 00:07:43.234 ************************************ 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- common/autotest_common.sh@1129 -- # nvme_doorbell_aers 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- nvme/nvme.sh@70 -- # bdfs=() 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- nvme/nvme.sh@70 -- # local bdfs bdf 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- nvme/nvme.sh@71 -- # bdfs=($(get_nvme_bdfs)) 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- nvme/nvme.sh@71 -- # get_nvme_bdfs 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- common/autotest_common.sh@1498 -- # bdfs=() 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- common/autotest_common.sh@1498 -- # local bdfs 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- common/autotest_common.sh@1499 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 
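
The AER tests in this part of the log (the single-AEN run that just completed, and the multi-AEN run later) all follow the same recipe: register an asynchronous event callback, lower the temperature threshold below the current temperature with Set Features, then poll admin completions until the controller fires the event. A hedged sketch of that loop is below; the 300 K threshold and the polling structure are assumptions, and the current temperature of 323 K comes from the log above.

#include "spdk/nvme.h"
#include <stdbool.h>

static bool g_aer_fired;

static void
aer_cb(void *ctx, const struct spdk_nvme_cpl *cpl)
{
        g_aer_fired = true; /* the test would also read log page 2 here */
}

static void
set_feature_done(void *arg, const struct spdk_nvme_cpl *cpl)
{
        *(bool *)arg = true;
}

/* Force a temperature AER by dropping the threshold below the current
 * temperature (323 K / 50 C per the log above). */
static void
trigger_temp_aer(struct spdk_nvme_ctrlr *ctrlr)
{
        bool set_done = false;

        spdk_nvme_ctrlr_register_aer_callback(ctrlr, aer_cb, NULL);
        /* cdw11 carries the threshold in Kelvin; 300 K is an assumption. */
        spdk_nvme_ctrlr_cmd_set_feature(ctrlr,
                                        SPDK_NVME_FEAT_TEMPERATURE_THRESHOLD,
                                        300, 0, NULL, 0,
                                        set_feature_done, &set_done);
        while (!set_done || !g_aer_fired) {
                spdk_nvme_ctrlr_process_admin_completions(ctrlr);
        }
}
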
00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- common/autotest_common.sh@1500 -- # (( 4 == 0 )) 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:00:10.0 0000:00:11.0 0000:00:12.0 0000:00:13.0 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- nvme/nvme.sh@72 -- # for bdf in "${bdfs[@]}" 00:07:43.234 04:58:03 nvme.nvme_doorbell_aers -- nvme/nvme.sh@73 -- # timeout --preserve-status 10 /home/vagrant/spdk_repo/spdk/test/nvme/doorbell_aers/doorbell_aers -r 'trtype:PCIe traddr:0000:00:10.0' 00:07:43.492 [2024-12-15 04:58:03.402775] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:07:53.492 Executing: test_write_invalid_db 00:07:53.492 Waiting for AER completion... 00:07:53.492 Failure: test_write_invalid_db 00:07:53.492 00:07:53.492 Executing: test_invalid_db_write_overflow_sq 00:07:53.492 Waiting for AER completion... 00:07:53.492 Failure: test_invalid_db_write_overflow_sq 00:07:53.492 00:07:53.492 Executing: test_invalid_db_write_overflow_cq 00:07:53.492 Waiting for AER completion... 00:07:53.492 Failure: test_invalid_db_write_overflow_cq 00:07:53.492 00:07:53.492 04:58:13 nvme.nvme_doorbell_aers -- nvme/nvme.sh@72 -- # for bdf in "${bdfs[@]}" 00:07:53.492 04:58:13 nvme.nvme_doorbell_aers -- nvme/nvme.sh@73 -- # timeout --preserve-status 10 /home/vagrant/spdk_repo/spdk/test/nvme/doorbell_aers/doorbell_aers -r 'trtype:PCIe traddr:0000:00:11.0' 00:07:53.492 [2024-12-15 04:58:13.445209] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:03.460 Executing: test_write_invalid_db 00:08:03.460 Waiting for AER completion... 00:08:03.460 Failure: test_write_invalid_db 00:08:03.460 00:08:03.460 Executing: test_invalid_db_write_overflow_sq 00:08:03.460 Waiting for AER completion... 00:08:03.460 Failure: test_invalid_db_write_overflow_sq 00:08:03.460 00:08:03.460 Executing: test_invalid_db_write_overflow_cq 00:08:03.460 Waiting for AER completion... 00:08:03.460 Failure: test_invalid_db_write_overflow_cq 00:08:03.460 00:08:03.460 04:58:23 nvme.nvme_doorbell_aers -- nvme/nvme.sh@72 -- # for bdf in "${bdfs[@]}" 00:08:03.460 04:58:23 nvme.nvme_doorbell_aers -- nvme/nvme.sh@73 -- # timeout --preserve-status 10 /home/vagrant/spdk_repo/spdk/test/nvme/doorbell_aers/doorbell_aers -r 'trtype:PCIe traddr:0000:00:12.0' 00:08:03.460 [2024-12-15 04:58:23.462499] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:13.423 Executing: test_write_invalid_db 00:08:13.423 Waiting for AER completion... 00:08:13.423 Failure: test_write_invalid_db 00:08:13.423 00:08:13.423 Executing: test_invalid_db_write_overflow_sq 00:08:13.423 Waiting for AER completion... 00:08:13.423 Failure: test_invalid_db_write_overflow_sq 00:08:13.423 00:08:13.423 Executing: test_invalid_db_write_overflow_cq 00:08:13.423 Waiting for AER completion... 
00:08:13.423 Failure: test_invalid_db_write_overflow_cq 00:08:13.423 00:08:13.423 04:58:33 nvme.nvme_doorbell_aers -- nvme/nvme.sh@72 -- # for bdf in "${bdfs[@]}" 00:08:13.423 04:58:33 nvme.nvme_doorbell_aers -- nvme/nvme.sh@73 -- # timeout --preserve-status 10 /home/vagrant/spdk_repo/spdk/test/nvme/doorbell_aers/doorbell_aers -r 'trtype:PCIe traddr:0000:00:13.0' 00:08:13.423 [2024-12-15 04:58:33.478411] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.387 Executing: test_write_invalid_db 00:08:23.387 Waiting for AER completion... 00:08:23.387 Failure: test_write_invalid_db 00:08:23.387 00:08:23.387 Executing: test_invalid_db_write_overflow_sq 00:08:23.387 Waiting for AER completion... 00:08:23.387 Failure: test_invalid_db_write_overflow_sq 00:08:23.387 00:08:23.387 Executing: test_invalid_db_write_overflow_cq 00:08:23.387 Waiting for AER completion... 00:08:23.387 Failure: test_invalid_db_write_overflow_cq 00:08:23.387 00:08:23.387 00:08:23.387 real 0m40.181s 00:08:23.387 user 0m34.060s 00:08:23.387 sys 0m5.766s 00:08:23.387 04:58:43 nvme.nvme_doorbell_aers -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:23.387 04:58:43 nvme.nvme_doorbell_aers -- common/autotest_common.sh@10 -- # set +x 00:08:23.387 ************************************ 00:08:23.387 END TEST nvme_doorbell_aers 00:08:23.387 ************************************ 00:08:23.387 04:58:43 nvme -- nvme/nvme.sh@97 -- # uname 00:08:23.387 04:58:43 nvme -- nvme/nvme.sh@97 -- # '[' Linux '!=' FreeBSD ']' 00:08:23.387 04:58:43 nvme -- nvme/nvme.sh@98 -- # run_test nvme_multi_aen /home/vagrant/spdk_repo/spdk/test/nvme/aer/aer -m -T -i 0 00:08:23.387 04:58:43 nvme -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:08:23.387 04:58:43 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:23.387 04:58:43 nvme -- common/autotest_common.sh@10 -- # set +x 00:08:23.387 ************************************ 00:08:23.387 START TEST nvme_multi_aen 00:08:23.387 ************************************ 00:08:23.387 04:58:43 nvme.nvme_multi_aen -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/aer/aer -m -T -i 0 00:08:23.645 [2024-12-15 04:58:43.540339] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.540400] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.540412] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.541665] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.541694] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.541702] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.542653] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. 
Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.542678] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.542685] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.543592] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.543617] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.645 [2024-12-15 04:58:43.543625] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 76845) is not found. Dropping the request. 00:08:23.646 Child process pid: 77371 00:08:23.646 [Child] Asynchronous Event Request test 00:08:23.646 [Child] Attached to 0000:00:10.0 00:08:23.646 [Child] Attached to 0000:00:11.0 00:08:23.646 [Child] Attached to 0000:00:13.0 00:08:23.646 [Child] Attached to 0000:00:12.0 00:08:23.646 [Child] Registering asynchronous event callbacks... 00:08:23.646 [Child] Getting orig temperature thresholds of all controllers 00:08:23.646 [Child] 0000:00:10.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:08:23.646 [Child] 0000:00:11.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:08:23.646 [Child] 0000:00:13.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:08:23.646 [Child] 0000:00:12.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:08:23.646 [Child] Waiting for all controllers to trigger AER and reset threshold 00:08:23.646 [Child] 0000:00:10.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:08:23.646 [Child] 0000:00:11.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:08:23.646 [Child] 0000:00:13.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:08:23.646 [Child] 0000:00:12.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:08:23.646 [Child] 0000:00:10.0: Current Temperature: 323 Kelvin (50 Celsius) 00:08:23.646 [Child] 0000:00:11.0: Current Temperature: 323 Kelvin (50 Celsius) 00:08:23.646 [Child] 0000:00:13.0: Current Temperature: 323 Kelvin (50 Celsius) 00:08:23.646 [Child] 0000:00:12.0: Current Temperature: 323 Kelvin (50 Celsius) 00:08:23.646 [Child] Cleaning up... 00:08:23.646 Asynchronous Event Request test 00:08:23.646 Attached to 0000:00:10.0 00:08:23.646 Attached to 0000:00:11.0 00:08:23.646 Attached to 0000:00:13.0 00:08:23.646 Attached to 0000:00:12.0 00:08:23.646 Reset controller to setup AER completions for this process 00:08:23.646 Registering asynchronous event callbacks... 
00:08:23.646 Getting orig temperature thresholds of all controllers 00:08:23.646 0000:00:10.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:08:23.646 0000:00:11.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:08:23.646 0000:00:13.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:08:23.646 0000:00:12.0: original temperature threshold: 343 Kelvin (70 Celsius) 00:08:23.646 Setting all controllers temperature threshold low to trigger AER 00:08:23.646 Waiting for all controllers temperature threshold to be set lower 00:08:23.646 0000:00:10.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:08:23.646 aer_cb - Resetting Temp Threshold for device: 0000:00:10.0 00:08:23.646 0000:00:11.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:08:23.646 aer_cb - Resetting Temp Threshold for device: 0000:00:11.0 00:08:23.646 0000:00:13.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:08:23.646 aer_cb - Resetting Temp Threshold for device: 0000:00:13.0 00:08:23.646 0000:00:12.0: aer_cb for log page 2, aen_event_type: 0x01, aen_event_info: 0x01 00:08:23.646 aer_cb - Resetting Temp Threshold for device: 0000:00:12.0 00:08:23.646 Waiting for all controllers to trigger AER and reset threshold 00:08:23.646 0000:00:10.0: Current Temperature: 323 Kelvin (50 Celsius) 00:08:23.646 0000:00:11.0: Current Temperature: 323 Kelvin (50 Celsius) 00:08:23.646 0000:00:13.0: Current Temperature: 323 Kelvin (50 Celsius) 00:08:23.646 0000:00:12.0: Current Temperature: 323 Kelvin (50 Celsius) 00:08:23.646 Cleaning up... 00:08:23.646 00:08:23.646 real 0m0.380s 00:08:23.646 user 0m0.142s 00:08:23.646 sys 0m0.152s 00:08:23.646 04:58:43 nvme.nvme_multi_aen -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:23.646 ************************************ 00:08:23.646 END TEST nvme_multi_aen 00:08:23.646 ************************************ 00:08:23.646 04:58:43 nvme.nvme_multi_aen -- common/autotest_common.sh@10 -- # set +x 00:08:23.646 04:58:43 nvme -- nvme/nvme.sh@99 -- # run_test nvme_startup /home/vagrant/spdk_repo/spdk/test/nvme/startup/startup -t 1000000 00:08:23.646 04:58:43 nvme -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:08:23.646 04:58:43 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:23.646 04:58:43 nvme -- common/autotest_common.sh@10 -- # set +x 00:08:23.904 ************************************ 00:08:23.904 START TEST nvme_startup 00:08:23.904 ************************************ 00:08:23.904 04:58:43 nvme.nvme_startup -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/startup/startup -t 1000000 00:08:23.904 Initializing NVMe Controllers 00:08:23.904 Attached to 0000:00:10.0 00:08:23.904 Attached to 0000:00:11.0 00:08:23.904 Attached to 0000:00:13.0 00:08:23.904 Attached to 0000:00:12.0 00:08:23.904 Initialization complete. 00:08:23.904 Time used:127877.969 (us). 
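
Every test binary in this log is launched with -i 0; that is the SPDK shared-memory ID, and it is what lets the multi-aen child process above attach to the same controllers as its parent. A process that initializes the env with the same shm_id joins the same DPDK shared-memory domain; a minimal sketch of that setup follows (the name string is an assumption).

#include "spdk/env.h"

/* Join shared-memory group 0, matching the '-i 0' flag used throughout
 * this log; another process with the same shm_id can share controllers. */
static int
init_env_shared(void)
{
        struct spdk_env_opts opts;

        spdk_env_opts_init(&opts);
        opts.name = "aer_child_sketch"; /* assumed, any unique name works */
        opts.shm_id = 0;
        return spdk_env_init(&opts);
}
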
00:08:23.904 00:08:23.904 real 0m0.177s 00:08:23.904 user 0m0.050s 00:08:23.904 sys 0m0.087s 00:08:23.904 04:58:43 nvme.nvme_startup -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:23.904 04:58:43 nvme.nvme_startup -- common/autotest_common.sh@10 -- # set +x 00:08:23.904 ************************************ 00:08:23.904 END TEST nvme_startup 00:08:23.904 ************************************ 00:08:23.904 04:58:43 nvme -- nvme/nvme.sh@100 -- # run_test nvme_multi_secondary nvme_multi_secondary 00:08:23.904 04:58:43 nvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:23.904 04:58:43 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:23.904 04:58:43 nvme -- common/autotest_common.sh@10 -- # set +x 00:08:23.904 ************************************ 00:08:23.904 START TEST nvme_multi_secondary 00:08:23.904 ************************************ 00:08:23.904 04:58:43 nvme.nvme_multi_secondary -- common/autotest_common.sh@1129 -- # nvme_multi_secondary 00:08:23.904 04:58:43 nvme.nvme_multi_secondary -- nvme/nvme.sh@52 -- # pid0=77422 00:08:23.904 04:58:43 nvme.nvme_multi_secondary -- nvme/nvme.sh@54 -- # pid1=77423 00:08:23.904 04:58:43 nvme.nvme_multi_secondary -- nvme/nvme.sh@55 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_perf -i 0 -q 16 -w read -o 4096 -t 3 -c 0x4 00:08:23.904 04:58:43 nvme.nvme_multi_secondary -- nvme/nvme.sh@51 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_perf -i 0 -q 16 -w read -o 4096 -t 5 -c 0x1 00:08:23.904 04:58:43 nvme.nvme_multi_secondary -- nvme/nvme.sh@53 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_perf -i 0 -q 16 -w read -o 4096 -t 3 -c 0x2 00:08:27.185 Initializing NVMe Controllers 00:08:27.185 Attached to NVMe Controller at 0000:00:10.0 [1b36:0010] 00:08:27.185 Attached to NVMe Controller at 0000:00:11.0 [1b36:0010] 00:08:27.185 Attached to NVMe Controller at 0000:00:13.0 [1b36:0010] 00:08:27.185 Attached to NVMe Controller at 0000:00:12.0 [1b36:0010] 00:08:27.185 Associating PCIE (0000:00:10.0) NSID 1 with lcore 1 00:08:27.185 Associating PCIE (0000:00:11.0) NSID 1 with lcore 1 00:08:27.185 Associating PCIE (0000:00:13.0) NSID 1 with lcore 1 00:08:27.185 Associating PCIE (0000:00:12.0) NSID 1 with lcore 1 00:08:27.185 Associating PCIE (0000:00:12.0) NSID 2 with lcore 1 00:08:27.185 Associating PCIE (0000:00:12.0) NSID 3 with lcore 1 00:08:27.185 Initialization complete. Launching workers. 
00:08:27.185 ======================================================== 00:08:27.185 Latency(us) 00:08:27.185 Device Information : IOPS MiB/s Average min max 00:08:27.185 PCIE (0000:00:10.0) NSID 1 from core 1: 5159.89 20.16 3099.26 817.88 11848.77 00:08:27.185 PCIE (0000:00:11.0) NSID 1 from core 1: 5159.89 20.16 3100.40 830.51 12005.56 00:08:27.185 PCIE (0000:00:13.0) NSID 1 from core 1: 5159.89 20.16 3100.87 835.37 12475.78 00:08:27.185 PCIE (0000:00:12.0) NSID 1 from core 1: 5159.89 20.16 3101.24 819.29 11890.10 00:08:27.185 PCIE (0000:00:12.0) NSID 2 from core 1: 5159.89 20.16 3101.50 826.19 12062.50 00:08:27.185 PCIE (0000:00:12.0) NSID 3 from core 1: 5159.89 20.16 3102.05 846.17 12077.38 00:08:27.185 ======================================================== 00:08:27.185 Total : 30959.34 120.93 3100.89 817.88 12475.78 00:08:27.185 00:08:27.443 Initializing NVMe Controllers 00:08:27.443 Attached to NVMe Controller at 0000:00:10.0 [1b36:0010] 00:08:27.443 Attached to NVMe Controller at 0000:00:11.0 [1b36:0010] 00:08:27.443 Attached to NVMe Controller at 0000:00:13.0 [1b36:0010] 00:08:27.443 Attached to NVMe Controller at 0000:00:12.0 [1b36:0010] 00:08:27.443 Associating PCIE (0000:00:10.0) NSID 1 with lcore 2 00:08:27.443 Associating PCIE (0000:00:11.0) NSID 1 with lcore 2 00:08:27.443 Associating PCIE (0000:00:13.0) NSID 1 with lcore 2 00:08:27.443 Associating PCIE (0000:00:12.0) NSID 1 with lcore 2 00:08:27.443 Associating PCIE (0000:00:12.0) NSID 2 with lcore 2 00:08:27.443 Associating PCIE (0000:00:12.0) NSID 3 with lcore 2 00:08:27.443 Initialization complete. Launching workers. 00:08:27.443 ======================================================== 00:08:27.443 Latency(us) 00:08:27.443 Device Information : IOPS MiB/s Average min max 00:08:27.443 PCIE (0000:00:10.0) NSID 1 from core 2: 1877.33 7.33 8519.71 1561.62 36095.38 00:08:27.443 PCIE (0000:00:11.0) NSID 1 from core 2: 1877.33 7.33 8522.58 1755.29 33903.61 00:08:27.443 PCIE (0000:00:13.0) NSID 1 from core 2: 1877.33 7.33 8522.08 1682.95 30121.06 00:08:27.443 PCIE (0000:00:12.0) NSID 1 from core 2: 1877.33 7.33 8523.01 1731.26 25471.66 00:08:27.443 PCIE (0000:00:12.0) NSID 2 from core 2: 1877.33 7.33 8523.28 1676.47 28538.32 00:08:27.443 PCIE (0000:00:12.0) NSID 3 from core 2: 1877.33 7.33 8523.45 1663.17 34721.58 00:08:27.443 ======================================================== 00:08:27.443 Total : 11263.97 44.00 8522.35 1561.62 36095.38 00:08:27.443 00:08:27.443 04:58:47 nvme.nvme_multi_secondary -- nvme/nvme.sh@56 -- # wait 77422 00:08:29.341 Initializing NVMe Controllers 00:08:29.341 Attached to NVMe Controller at 0000:00:10.0 [1b36:0010] 00:08:29.341 Attached to NVMe Controller at 0000:00:11.0 [1b36:0010] 00:08:29.341 Attached to NVMe Controller at 0000:00:13.0 [1b36:0010] 00:08:29.341 Attached to NVMe Controller at 0000:00:12.0 [1b36:0010] 00:08:29.341 Associating PCIE (0000:00:10.0) NSID 1 with lcore 0 00:08:29.341 Associating PCIE (0000:00:11.0) NSID 1 with lcore 0 00:08:29.341 Associating PCIE (0000:00:13.0) NSID 1 with lcore 0 00:08:29.341 Associating PCIE (0000:00:12.0) NSID 1 with lcore 0 00:08:29.341 Associating PCIE (0000:00:12.0) NSID 2 with lcore 0 00:08:29.341 Associating PCIE (0000:00:12.0) NSID 3 with lcore 0 00:08:29.341 Initialization complete. Launching workers. 
00:08:29.341 ======================================================== 00:08:29.341 Latency(us) 00:08:29.341 Device Information : IOPS MiB/s Average min max 00:08:29.341 PCIE (0000:00:10.0) NSID 1 from core 0: 6479.67 25.31 2467.68 756.17 10843.10 00:08:29.341 PCIE (0000:00:11.0) NSID 1 from core 0: 6479.67 25.31 2468.57 765.78 10424.98 00:08:29.341 PCIE (0000:00:13.0) NSID 1 from core 0: 6479.67 25.31 2468.39 774.24 11485.83 00:08:29.341 PCIE (0000:00:12.0) NSID 1 from core 0: 6479.67 25.31 2468.20 779.76 11408.23 00:08:29.341 PCIE (0000:00:12.0) NSID 2 from core 0: 6479.67 25.31 2468.00 789.12 11132.69 00:08:29.341 PCIE (0000:00:12.0) NSID 3 from core 0: 6479.67 25.31 2467.80 781.62 10550.67 00:08:29.341 ======================================================== 00:08:29.341 Total : 38878.03 151.87 2468.11 756.17 11485.83 00:08:29.341 00:08:29.341 04:58:49 nvme.nvme_multi_secondary -- nvme/nvme.sh@57 -- # wait 77423 00:08:29.341 04:58:49 nvme.nvme_multi_secondary -- nvme/nvme.sh@61 -- # pid0=77492 00:08:29.341 04:58:49 nvme.nvme_multi_secondary -- nvme/nvme.sh@60 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_perf -i 0 -q 16 -w read -o 4096 -t 3 -c 0x1 00:08:29.341 04:58:49 nvme.nvme_multi_secondary -- nvme/nvme.sh@63 -- # pid1=77493 00:08:29.341 04:58:49 nvme.nvme_multi_secondary -- nvme/nvme.sh@64 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_perf -i 0 -q 16 -w read -o 4096 -t 5 -c 0x4 00:08:29.341 04:58:49 nvme.nvme_multi_secondary -- nvme/nvme.sh@62 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_perf -i 0 -q 16 -w read -o 4096 -t 3 -c 0x2 00:08:32.623 Initializing NVMe Controllers 00:08:32.623 Attached to NVMe Controller at 0000:00:10.0 [1b36:0010] 00:08:32.623 Attached to NVMe Controller at 0000:00:11.0 [1b36:0010] 00:08:32.623 Attached to NVMe Controller at 0000:00:13.0 [1b36:0010] 00:08:32.623 Attached to NVMe Controller at 0000:00:12.0 [1b36:0010] 00:08:32.623 Associating PCIE (0000:00:10.0) NSID 1 with lcore 0 00:08:32.623 Associating PCIE (0000:00:11.0) NSID 1 with lcore 0 00:08:32.623 Associating PCIE (0000:00:13.0) NSID 1 with lcore 0 00:08:32.623 Associating PCIE (0000:00:12.0) NSID 1 with lcore 0 00:08:32.623 Associating PCIE (0000:00:12.0) NSID 2 with lcore 0 00:08:32.623 Associating PCIE (0000:00:12.0) NSID 3 with lcore 0 00:08:32.623 Initialization complete. Launching workers. 
00:08:32.623 ======================================================== 00:08:32.623 Latency(us) 00:08:32.623 Device Information : IOPS MiB/s Average min max 00:08:32.623 PCIE (0000:00:10.0) NSID 1 from core 0: 5044.24 19.70 3170.27 1148.24 14285.96 00:08:32.623 PCIE (0000:00:11.0) NSID 1 from core 0: 5044.24 19.70 3171.64 1131.91 14160.11 00:08:32.623 PCIE (0000:00:13.0) NSID 1 from core 0: 5044.24 19.70 3171.77 1147.97 13923.26 00:08:32.623 PCIE (0000:00:12.0) NSID 1 from core 0: 5044.24 19.70 3171.90 1199.14 13762.69 00:08:32.623 PCIE (0000:00:12.0) NSID 2 from core 0: 5044.24 19.70 3172.00 1207.09 14458.98 00:08:32.623 PCIE (0000:00:12.0) NSID 3 from core 0: 5049.57 19.72 3168.74 1172.55 14013.46 00:08:32.623 ======================================================== 00:08:32.623 Total : 30270.78 118.25 3171.05 1131.91 14458.98 00:08:32.623 00:08:32.623 Initializing NVMe Controllers 00:08:32.623 Attached to NVMe Controller at 0000:00:10.0 [1b36:0010] 00:08:32.623 Attached to NVMe Controller at 0000:00:11.0 [1b36:0010] 00:08:32.623 Attached to NVMe Controller at 0000:00:13.0 [1b36:0010] 00:08:32.623 Attached to NVMe Controller at 0000:00:12.0 [1b36:0010] 00:08:32.623 Associating PCIE (0000:00:10.0) NSID 1 with lcore 1 00:08:32.623 Associating PCIE (0000:00:11.0) NSID 1 with lcore 1 00:08:32.623 Associating PCIE (0000:00:13.0) NSID 1 with lcore 1 00:08:32.623 Associating PCIE (0000:00:12.0) NSID 1 with lcore 1 00:08:32.623 Associating PCIE (0000:00:12.0) NSID 2 with lcore 1 00:08:32.623 Associating PCIE (0000:00:12.0) NSID 3 with lcore 1 00:08:32.623 Initialization complete. Launching workers. 00:08:32.623 ======================================================== 00:08:32.623 Latency(us) 00:08:32.623 Device Information : IOPS MiB/s Average min max 00:08:32.623 PCIE (0000:00:10.0) NSID 1 from core 1: 5343.64 20.87 2992.61 770.33 10439.65 00:08:32.623 PCIE (0000:00:11.0) NSID 1 from core 1: 5343.64 20.87 2993.65 786.22 11312.54 00:08:32.623 PCIE (0000:00:13.0) NSID 1 from core 1: 5343.64 20.87 2993.52 711.99 10379.03 00:08:32.623 PCIE (0000:00:12.0) NSID 1 from core 1: 5343.64 20.87 2993.41 628.91 11508.01 00:08:32.623 PCIE (0000:00:12.0) NSID 2 from core 1: 5343.64 20.87 2993.30 542.61 10710.28 00:08:32.623 PCIE (0000:00:12.0) NSID 3 from core 1: 5348.97 20.89 2990.19 456.84 11574.46 00:08:32.623 ======================================================== 00:08:32.623 Total : 32067.18 125.26 2992.78 456.84 11574.46 00:08:32.623 00:08:34.522 Initializing NVMe Controllers 00:08:34.522 Attached to NVMe Controller at 0000:00:10.0 [1b36:0010] 00:08:34.522 Attached to NVMe Controller at 0000:00:11.0 [1b36:0010] 00:08:34.522 Attached to NVMe Controller at 0000:00:13.0 [1b36:0010] 00:08:34.522 Attached to NVMe Controller at 0000:00:12.0 [1b36:0010] 00:08:34.522 Associating PCIE (0000:00:10.0) NSID 1 with lcore 2 00:08:34.522 Associating PCIE (0000:00:11.0) NSID 1 with lcore 2 00:08:34.522 Associating PCIE (0000:00:13.0) NSID 1 with lcore 2 00:08:34.522 Associating PCIE (0000:00:12.0) NSID 1 with lcore 2 00:08:34.522 Associating PCIE (0000:00:12.0) NSID 2 with lcore 2 00:08:34.522 Associating PCIE (0000:00:12.0) NSID 3 with lcore 2 00:08:34.522 Initialization complete. Launching workers. 
00:08:34.522 ======================================================== 00:08:34.522 Latency(us) 00:08:34.522 Device Information : IOPS MiB/s Average min max 00:08:34.522 PCIE (0000:00:10.0) NSID 1 from core 2: 3807.51 14.87 4200.17 771.52 28501.72 00:08:34.522 PCIE (0000:00:11.0) NSID 1 from core 2: 3807.51 14.87 4201.67 701.58 31471.60 00:08:34.522 PCIE (0000:00:13.0) NSID 1 from core 2: 3807.51 14.87 4201.61 740.55 27823.75 00:08:34.522 PCIE (0000:00:12.0) NSID 1 from core 2: 3807.51 14.87 4201.77 637.61 28338.95 00:08:34.522 PCIE (0000:00:12.0) NSID 2 from core 2: 3807.51 14.87 4201.29 566.51 28603.49 00:08:34.522 PCIE (0000:00:12.0) NSID 3 from core 2: 3810.71 14.89 4195.16 468.15 30185.20 00:08:34.522 ======================================================== 00:08:34.522 Total : 22848.27 89.25 4200.28 468.15 31471.60 00:08:34.522 00:08:34.522 04:58:54 nvme.nvme_multi_secondary -- nvme/nvme.sh@65 -- # wait 77492 00:08:34.522 04:58:54 nvme.nvme_multi_secondary -- nvme/nvme.sh@66 -- # wait 77493 00:08:34.522 00:08:34.522 real 0m10.498s 00:08:34.522 user 0m18.261s 00:08:34.522 sys 0m0.566s 00:08:34.522 04:58:54 nvme.nvme_multi_secondary -- common/autotest_common.sh@1130 -- # xtrace_disable 00:08:34.522 04:58:54 nvme.nvme_multi_secondary -- common/autotest_common.sh@10 -- # set +x 00:08:34.522 ************************************ 00:08:34.522 END TEST nvme_multi_secondary 00:08:34.522 ************************************ 00:08:34.522 04:58:54 nvme -- nvme/nvme.sh@101 -- # trap - SIGINT SIGTERM EXIT 00:08:34.522 04:58:54 nvme -- nvme/nvme.sh@102 -- # kill_stub 00:08:34.522 04:58:54 nvme -- common/autotest_common.sh@1093 -- # [[ -e /proc/76456 ]] 00:08:34.522 04:58:54 nvme -- common/autotest_common.sh@1094 -- # kill 76456 00:08:34.522 04:58:54 nvme -- common/autotest_common.sh@1095 -- # wait 76456 00:08:34.522 [2024-12-15 04:58:54.523053] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.523150] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.523175] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.523199] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.523872] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.523950] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.523970] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.523991] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.524623] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 
00:08:34.522 [2024-12-15 04:58:54.524681] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.524701] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.524724] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.525333] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.525392] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.525419] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 [2024-12-15 04:58:54.525475] nvme_pcie_common.c: 321:nvme_pcie_qpair_insert_pending_admin_request: *ERROR*: The owning process (pid 77369) is not found. Dropping the request. 00:08:34.522 04:58:54 nvme -- common/autotest_common.sh@1097 -- # rm -f /var/run/spdk_stub0 00:08:34.522 04:58:54 nvme -- common/autotest_common.sh@1101 -- # echo 2 00:08:34.522 04:58:54 nvme -- nvme/nvme.sh@105 -- # run_test bdev_nvme_reset_stuck_adm_cmd /home/vagrant/spdk_repo/spdk/test/nvme/nvme_reset_stuck_adm_cmd.sh 00:08:34.522 04:58:54 nvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:34.522 04:58:54 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:34.522 04:58:54 nvme -- common/autotest_common.sh@10 -- # set +x 00:08:34.522 ************************************ 00:08:34.522 START TEST bdev_nvme_reset_stuck_adm_cmd 00:08:34.522 ************************************ 00:08:34.522 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/nvme_reset_stuck_adm_cmd.sh 00:08:34.780 * Looking for test storage... 
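The burst of "owning process (pid 77369) is not found. Dropping the request." errors above is expected at this point: pid 77369 has already exited, so its pending per-process admin requests are discarded while kill_stub tears down the stub primary (pid 76456). The traced steps ([[ -e /proc/76456 ]], kill, wait, rm -f /var/run/spdk_stub0) amount to the following shape; a condensed sketch, not the full kill_stub from autotest_common.sh:

kill_stub_sketch() {               # condensed; the real helper does more
    local pid=$1                   # stub primary pid, 76456 in this run
    if [[ -e /proc/$pid ]]; then   # only signal if the stub is still alive
        kill "$pid"
        wait "$pid"                # reap it and collect its exit status
    fi
    rm -f /var/run/spdk_stub0      # remove the stub's sentinel file
}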
00:08:34.780 * Found test storage at /home/vagrant/spdk_repo/spdk/test/nvme 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1711 -- # lcov --version 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@333 -- # local ver1 ver1_l 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@334 -- # local ver2 ver2_l 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@336 -- # IFS=.-: 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@336 -- # read -ra ver1 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@337 -- # IFS=.-: 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@337 -- # read -ra ver2 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@338 -- # local 'op=<' 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@340 -- # ver1_l=2 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@341 -- # ver2_l=1 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@344 -- # case "$op" in 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@345 -- # : 1 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@364 -- # (( v = 0 )) 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@365 -- # decimal 1 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@353 -- # local d=1 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@355 -- # echo 1 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@365 -- # ver1[v]=1 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@366 -- # decimal 2 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@353 -- # local d=2 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@355 -- # echo 2 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@366 -- # ver2[v]=2 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- scripts/common.sh@368 -- # return 0 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:08:34.780 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.780 --rc genhtml_branch_coverage=1 00:08:34.780 --rc genhtml_function_coverage=1 00:08:34.780 --rc genhtml_legend=1 00:08:34.780 --rc geninfo_all_blocks=1 00:08:34.780 --rc geninfo_unexecuted_blocks=1 00:08:34.780 00:08:34.780 ' 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:08:34.780 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.780 --rc genhtml_branch_coverage=1 00:08:34.780 --rc genhtml_function_coverage=1 00:08:34.780 --rc genhtml_legend=1 00:08:34.780 --rc geninfo_all_blocks=1 00:08:34.780 --rc geninfo_unexecuted_blocks=1 00:08:34.780 00:08:34.780 ' 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:08:34.780 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.780 --rc genhtml_branch_coverage=1 00:08:34.780 --rc genhtml_function_coverage=1 00:08:34.780 --rc genhtml_legend=1 00:08:34.780 --rc geninfo_all_blocks=1 00:08:34.780 --rc geninfo_unexecuted_blocks=1 00:08:34.780 00:08:34.780 ' 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:08:34.780 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:08:34.780 --rc genhtml_branch_coverage=1 00:08:34.780 --rc genhtml_function_coverage=1 00:08:34.780 --rc genhtml_legend=1 00:08:34.780 --rc geninfo_all_blocks=1 00:08:34.780 --rc geninfo_unexecuted_blocks=1 00:08:34.780 00:08:34.780 ' 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@18 -- # ctrlr_name=nvme0 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@20 -- # err_injection_timeout=15000000 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@22 -- # test_timeout=5 00:08:34.780 
04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@25 -- # err_injection_sct=0 00:08:34.780 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@27 -- # err_injection_sc=1 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@29 -- # get_first_nvme_bdf 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1509 -- # bdfs=() 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1509 -- # local bdfs 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1510 -- # bdfs=($(get_nvme_bdfs)) 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1510 -- # get_nvme_bdfs 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1498 -- # bdfs=() 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1498 -- # local bdfs 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1499 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1500 -- # (( 4 == 0 )) 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:00:10.0 0000:00:11.0 0000:00:12.0 0000:00:13.0 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1512 -- # echo 0000:00:10.0 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@29 -- # bdf=0000:00:10.0 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@30 -- # '[' -z 0000:00:10.0 ']' 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@35 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0xF 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@36 -- # spdk_target_pid=77648 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@37 -- # trap 'killprocess "$spdk_target_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@38 -- # waitforlisten 77648 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@835 -- # '[' -z 77648 ']' 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@840 -- # local max_retries=100 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:34.781 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
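The get_first_nvme_bdf helper traced above resolves the target PCI address by asking gen_nvme.sh for a JSON bdev config and pulling every traddr out with jq. Extracted into a standalone snippet (same commands as the trace, with the repo path used in this run):

rootdir=/home/vagrant/spdk_repo/spdk                    # repo path in this run
bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
(( ${#bdfs[@]} > 0 )) || { echo "No NVMe bdevs found" >&2; exit 1; }
printf '%s\n' "${bdfs[@]}"     # 0000:00:10.0 0000:00:11.0 0000:00:12.0 0000:00:13.0 here
bdf=${bdfs[0]}                 # first controller; 0000:00:10.0 above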
00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@844 -- # xtrace_disable 00:08:34.781 04:58:54 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@10 -- # set +x 00:08:34.781 [2024-12-15 04:58:54.878718] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:08:34.781 [2024-12-15 04:58:54.878837] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid77648 ] 00:08:35.039 [2024-12-15 04:58:55.046557] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:35.039 [2024-12-15 04:58:55.068221] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:08:35.039 [2024-12-15 04:58:55.068527] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:08:35.039 [2024-12-15 04:58:55.068551] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:08:35.039 [2024-12-15 04:58:55.068573] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 3 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@868 -- # return 0 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@40 -- # rpc_cmd bdev_nvme_attach_controller -b nvme0 -t PCIe -a 0000:00:10.0 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@563 -- # xtrace_disable 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@10 -- # set +x 00:08:35.603 nvme0n1 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@41 -- # mktemp /tmp/err_inj_XXXXX.txt 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@41 -- # tmp_file=/tmp/err_inj_zTjqF.txt 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@44 -- # rpc_cmd bdev_nvme_add_error_injection -n nvme0 --cmd-type admin --opc 10 --timeout-in-us 15000000 --err-count 1 --sct 0 --sc 1 --do_not_submit 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@563 -- # xtrace_disable 00:08:35.603 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@10 -- # set +x 00:08:35.862 true 00:08:35.862 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:08:35.862 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@45 -- # date +%s 00:08:35.862 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@45 -- # start_time=1734238735 00:08:35.862 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@51 -- # get_feat_pid=77671 00:08:35.862 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@52 -- # trap 'killprocess "$get_feat_pid"; exit 1' SIGINT SIGTERM EXIT 00:08:35.862 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@55 -- # sleep 2 00:08:35.862 04:58:55 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@50 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_send_cmd -n nvme0 -t admin -r c2h -c 
CgAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAcAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA== 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@57 -- # rpc_cmd bdev_nvme_reset_controller nvme0 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@563 -- # xtrace_disable 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@10 -- # set +x 00:08:37.764 [2024-12-15 04:58:57.761916] nvme_ctrlr.c:1728:nvme_ctrlr_disconnect: *NOTICE*: [0000:00:10.0, 0] resetting controller 00:08:37.764 [2024-12-15 04:58:57.762425] nvme_qpair.c: 558:nvme_qpair_manual_complete_request: *NOTICE*: Command completed manually: 00:08:37.764 [2024-12-15 04:58:57.762484] nvme_qpair.c: 213:nvme_admin_qpair_print_command: *NOTICE*: GET FEATURES NUMBER OF QUEUES cid:0 cdw10:00000007 PRP1 0x0 PRP2 0x0 00:08:37.764 [2024-12-15 04:58:57.762501] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: INVALID OPCODE (00/01) qid:0 cid:0 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:08:37.764 [2024-12-15 04:58:57.764171] bdev_nvme.c:2287:bdev_nvme_reset_ctrlr_complete: *NOTICE*: [0000:00:10.0, 0] Resetting controller successful. 00:08:37.764 Waiting for RPC error injection (bdev_nvme_send_cmd) process PID: 77671 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@59 -- # echo 'Waiting for RPC error injection (bdev_nvme_send_cmd) process PID:' 77671 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@60 -- # wait 77671 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@61 -- # date +%s 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@61 -- # diff_time=2 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@62 -- # rpc_cmd bdev_nvme_detach_controller nvme0 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@563 -- # xtrace_disable 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@10 -- # set +x 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@64 -- # trap - SIGINT SIGTERM EXIT 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@67 -- # jq -r .cpl /tmp/err_inj_zTjqF.txt 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@67 -- # spdk_nvme_status=AAAAAAAAAAAAAAAAAAACAA== 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@68 -- # base64_decode_bits AAAAAAAAAAAAAAAAAAACAA== 1 255 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@11 -- # local bin_array status 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@13 -- # bin_array=($(base64 -d <(printf '%s' "$1") | hexdump -ve '/1 "0x%02x\n"')) 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@13 -- # hexdump -ve '/1 "0x%02x\n"' 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@13 -- # base64 -d /dev/fd/63 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- 
nvme/nvme_reset_stuck_adm_cmd.sh@13 -- # printf %s AAAAAAAAAAAAAAAAAAACAA== 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@14 -- # status=2 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@15 -- # printf 0x%x 1 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@68 -- # nvme_status_sc=0x1 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@69 -- # base64_decode_bits AAAAAAAAAAAAAAAAAAACAA== 9 3 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@11 -- # local bin_array status 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@13 -- # bin_array=($(base64 -d <(printf '%s' "$1") | hexdump -ve '/1 "0x%02x\n"')) 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@13 -- # printf %s AAAAAAAAAAAAAAAAAAACAA== 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@13 -- # hexdump -ve '/1 "0x%02x\n"' 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@13 -- # base64 -d /dev/fd/63 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@14 -- # status=2 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@15 -- # printf 0x%x 0 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@69 -- # nvme_status_sct=0x0 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@71 -- # rm -f /tmp/err_inj_zTjqF.txt 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@73 -- # killprocess 77648 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@954 -- # '[' -z 77648 ']' 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@958 -- # kill -0 77648 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@959 -- # uname 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 77648 00:08:37.764 killing process with pid 77648 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@972 -- # echo 'killing process with pid 77648' 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@973 -- # kill 77648 00:08:37.764 04:58:57 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@978 -- # wait 77648 00:08:38.029 04:58:58 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@75 -- # (( err_injection_sc != nvme_status_sc || err_injection_sct != nvme_status_sct )) 00:08:38.029 04:58:58 nvme.bdev_nvme_reset_stuck_adm_cmd -- nvme/nvme_reset_stuck_adm_cmd.sh@79 -- # (( diff_time > test_timeout )) 00:08:38.029 00:08:38.029 real 0m3.523s 00:08:38.029 user 0m12.541s 00:08:38.029 sys 0m0.477s 00:08:38.029 04:58:58 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@1130 -- # 
xtrace_disable 00:08:38.029 04:58:58 nvme.bdev_nvme_reset_stuck_adm_cmd -- common/autotest_common.sh@10 -- # set +x 00:08:38.029 ************************************ 00:08:38.029 END TEST bdev_nvme_reset_stuck_adm_cmd 00:08:38.029 ************************************ 00:08:38.029 04:58:58 nvme -- nvme/nvme.sh@107 -- # [[ y == y ]] 00:08:38.029 04:58:58 nvme -- nvme/nvme.sh@108 -- # run_test nvme_fio nvme_fio_test 00:08:38.029 04:58:58 nvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:08:38.029 04:58:58 nvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:08:38.029 04:58:58 nvme -- common/autotest_common.sh@10 -- # set +x 00:08:38.290 ************************************ 00:08:38.290 START TEST nvme_fio 00:08:38.290 ************************************ 00:08:38.290 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1129 -- # nvme_fio_test 00:08:38.290 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@31 -- # PLUGIN_DIR=/home/vagrant/spdk_repo/spdk/app/fio/nvme 00:08:38.290 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@32 -- # ran_fio=false 00:08:38.290 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@33 -- # get_nvme_bdfs 00:08:38.290 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1498 -- # bdfs=() 00:08:38.290 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1498 -- # local bdfs 00:08:38.290 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:08:38.290 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:08:38.290 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1499 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:08:38.290 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1500 -- # (( 4 == 0 )) 00:08:38.290 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:00:10.0 0000:00:11.0 0000:00:12.0 0000:00:13.0 00:08:38.290 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@33 -- # bdfs=('0000:00:10.0' '0000:00:11.0' '0000:00:12.0' '0000:00:13.0') 00:08:38.290 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@33 -- # local bdfs bdf 00:08:38.290 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@34 -- # for bdf in "${bdfs[@]}" 00:08:38.290 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@35 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:10.0' 00:08:38.290 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@35 -- # grep -qE '^Namespace ID:[0-9]+' 00:08:38.551 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@38 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:10.0' 00:08:38.551 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@38 -- # grep -q 'Extended Data LBA' 00:08:38.551 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@41 -- # bs=4096 00:08:38.551 04:58:58 nvme.nvme_fio -- nvme/nvme.sh@43 -- # fio_nvme /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.10.0' --bs=4096 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1364 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.10.0' --bs=4096 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1343 -- # local sanitizers 00:08:38.551 04:58:58 nvme.nvme_fio -- 
common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1345 -- # shift 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1347 -- # local asan_lib= 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # grep libasan 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1351 -- # break 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme' 00:08:38.551 04:58:58 nvme.nvme_fio -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.10.0' --bs=4096 00:08:38.812 test: (g=0): rw=randrw, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk, iodepth=128 00:08:38.812 fio-3.35 00:08:38.812 Starting 1 thread 00:08:45.396 00:08:45.396 test: (groupid=0, jobs=1): err= 0: pid=77797: Sun Dec 15 04:59:04 2024 00:08:45.396 read: IOPS=21.3k, BW=83.3MiB/s (87.4MB/s)(167MiB/2001msec) 00:08:45.396 slat (nsec): min=3962, max=69781, avg=5747.45, stdev=2223.55 00:08:45.396 clat (usec): min=343, max=9191, avg=2990.29, stdev=917.27 00:08:45.396 lat (usec): min=348, max=9205, avg=2996.04, stdev=918.43 00:08:45.396 clat percentiles (usec): 00:08:45.396 | 1.00th=[ 2147], 5.00th=[ 2311], 10.00th=[ 2376], 20.00th=[ 2507], 00:08:45.396 | 30.00th=[ 2573], 40.00th=[ 2638], 50.00th=[ 2704], 60.00th=[ 2769], 00:08:45.396 | 70.00th=[ 2900], 80.00th=[ 3163], 90.00th=[ 3949], 95.00th=[ 5014], 00:08:45.396 | 99.00th=[ 6783], 99.50th=[ 7570], 99.90th=[ 8717], 99.95th=[ 8848], 00:08:45.396 | 99.99th=[ 9110] 00:08:45.396 bw ( KiB/s): min=82952, max=84840, per=98.21%, avg=83808.00, stdev=956.23, samples=3 00:08:45.396 iops : min=20738, max=21210, avg=20952.00, stdev=239.06, samples=3 00:08:45.396 write: IOPS=21.2k, BW=82.7MiB/s (86.7MB/s)(166MiB/2001msec); 0 zone resets 00:08:45.396 slat (nsec): min=4220, max=68550, avg=6125.34, stdev=2256.92 00:08:45.396 clat (usec): min=303, max=9280, avg=3009.47, stdev=928.38 00:08:45.396 lat (usec): min=309, max=9295, avg=3015.59, stdev=929.56 00:08:45.396 clat percentiles (usec): 00:08:45.396 | 1.00th=[ 2180], 5.00th=[ 2343], 10.00th=[ 2409], 20.00th=[ 2507], 00:08:45.396 | 30.00th=[ 2573], 40.00th=[ 2638], 50.00th=[ 2704], 60.00th=[ 2802], 00:08:45.396 | 70.00th=[ 2933], 80.00th=[ 3195], 90.00th=[ 4015], 95.00th=[ 5145], 00:08:45.396 | 99.00th=[ 6783], 99.50th=[ 7439], 99.90th=[ 8848], 99.95th=[ 8979], 00:08:45.396 | 99.99th=[ 9110] 00:08:45.396 bw ( KiB/s): min=82960, max=84976, per=99.01%, avg=83869.33, stdev=1022.38, samples=3 00:08:45.396 iops : min=20740, max=21244, avg=20967.33, stdev=255.60, samples=3 00:08:45.396 lat (usec) : 500=0.01%, 750=0.03%, 1000=0.02% 00:08:45.396 lat (msec) : 2=0.29%, 4=89.79%, 10=9.86% 00:08:45.396 cpu : usr=99.20%, sys=0.00%, ctx=5, majf=0, minf=627 
00:08:45.396 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.1%, >=64=99.9% 00:08:45.396 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:45.396 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:08:45.397 issued rwts: total=42690,42374,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:45.397 latency : target=0, window=0, percentile=100.00%, depth=128 00:08:45.397 00:08:45.397 Run status group 0 (all jobs): 00:08:45.397 READ: bw=83.3MiB/s (87.4MB/s), 83.3MiB/s-83.3MiB/s (87.4MB/s-87.4MB/s), io=167MiB (175MB), run=2001-2001msec 00:08:45.397 WRITE: bw=82.7MiB/s (86.7MB/s), 82.7MiB/s-82.7MiB/s (86.7MB/s-86.7MB/s), io=166MiB (174MB), run=2001-2001msec 00:08:45.397 ----------------------------------------------------- 00:08:45.397 Suppressions used: 00:08:45.397 count bytes template 00:08:45.397 1 32 /usr/src/fio/parse.c 00:08:45.397 1 8 libtcmalloc_minimal.so 00:08:45.397 ----------------------------------------------------- 00:08:45.397 00:08:45.397 04:59:04 nvme.nvme_fio -- nvme/nvme.sh@44 -- # ran_fio=true 00:08:45.397 04:59:04 nvme.nvme_fio -- nvme/nvme.sh@34 -- # for bdf in "${bdfs[@]}" 00:08:45.397 04:59:04 nvme.nvme_fio -- nvme/nvme.sh@35 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:11.0' 00:08:45.397 04:59:04 nvme.nvme_fio -- nvme/nvme.sh@35 -- # grep -qE '^Namespace ID:[0-9]+' 00:08:45.397 04:59:04 nvme.nvme_fio -- nvme/nvme.sh@38 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:11.0' 00:08:45.397 04:59:04 nvme.nvme_fio -- nvme/nvme.sh@38 -- # grep -q 'Extended Data LBA' 00:08:45.397 04:59:05 nvme.nvme_fio -- nvme/nvme.sh@41 -- # bs=4096 00:08:45.397 04:59:05 nvme.nvme_fio -- nvme/nvme.sh@43 -- # fio_nvme /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.11.0' --bs=4096 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1364 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.11.0' --bs=4096 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1343 -- # local sanitizers 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1345 -- # shift 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1347 -- # local asan_lib= 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # grep libasan 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1351 -- # break 00:08:45.397 04:59:05 nvme.nvme_fio -- 
common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme' 00:08:45.397 04:59:05 nvme.nvme_fio -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.11.0' --bs=4096 00:08:45.397 test: (g=0): rw=randrw, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk, iodepth=128 00:08:45.397 fio-3.35 00:08:45.397 Starting 1 thread 00:08:51.981 00:08:51.981 test: (groupid=0, jobs=1): err= 0: pid=77852: Sun Dec 15 04:59:11 2024 00:08:51.981 read: IOPS=22.7k, BW=88.8MiB/s (93.1MB/s)(178MiB/2001msec) 00:08:51.981 slat (nsec): min=4233, max=52020, avg=4936.50, stdev=1884.12 00:08:51.981 clat (usec): min=771, max=9265, avg=2808.18, stdev=768.30 00:08:51.981 lat (usec): min=784, max=9305, avg=2813.12, stdev=769.42 00:08:51.981 clat percentiles (usec): 00:08:51.981 | 1.00th=[ 2024], 5.00th=[ 2180], 10.00th=[ 2278], 20.00th=[ 2376], 00:08:51.981 | 30.00th=[ 2474], 40.00th=[ 2540], 50.00th=[ 2606], 60.00th=[ 2671], 00:08:51.981 | 70.00th=[ 2769], 80.00th=[ 2966], 90.00th=[ 3523], 95.00th=[ 4424], 00:08:51.981 | 99.00th=[ 6259], 99.50th=[ 6652], 99.90th=[ 7046], 99.95th=[ 7373], 00:08:51.981 | 99.99th=[ 9110] 00:08:51.981 bw ( KiB/s): min=85416, max=92744, per=97.93%, avg=89045.33, stdev=3664.49, samples=3 00:08:51.981 iops : min=21354, max=23186, avg=22261.33, stdev=916.12, samples=3 00:08:51.981 write: IOPS=22.6k, BW=88.3MiB/s (92.6MB/s)(177MiB/2001msec); 0 zone resets 00:08:51.981 slat (nsec): min=4317, max=75958, avg=5227.55, stdev=2094.35 00:08:51.981 clat (usec): min=681, max=9179, avg=2822.43, stdev=783.53 00:08:51.981 lat (usec): min=695, max=9196, avg=2827.66, stdev=784.69 00:08:51.981 clat percentiles (usec): 00:08:51.981 | 1.00th=[ 2040], 5.00th=[ 2212], 10.00th=[ 2278], 20.00th=[ 2376], 00:08:51.981 | 30.00th=[ 2474], 40.00th=[ 2540], 50.00th=[ 2606], 60.00th=[ 2671], 00:08:51.981 | 70.00th=[ 2802], 80.00th=[ 2999], 90.00th=[ 3556], 95.00th=[ 4490], 00:08:51.981 | 99.00th=[ 6325], 99.50th=[ 6652], 99.90th=[ 7046], 99.95th=[ 7570], 00:08:51.981 | 99.99th=[ 8979] 00:08:51.981 bw ( KiB/s): min=85144, max=92400, per=98.72%, avg=89232.00, stdev=3714.46, samples=3 00:08:51.981 iops : min=21286, max=23100, avg=22308.00, stdev=928.61, samples=3 00:08:51.981 lat (usec) : 750=0.01%, 1000=0.01% 00:08:51.981 lat (msec) : 2=0.70%, 4=92.62%, 10=6.67% 00:08:51.981 cpu : usr=99.25%, sys=0.10%, ctx=3, majf=0, minf=627 00:08:51.981 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.1%, >=64=99.9% 00:08:51.981 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:51.981 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:08:51.981 issued rwts: total=45486,45216,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:51.981 latency : target=0, window=0, percentile=100.00%, depth=128 00:08:51.981 00:08:51.981 Run status group 0 (all jobs): 00:08:51.981 READ: bw=88.8MiB/s (93.1MB/s), 88.8MiB/s-88.8MiB/s (93.1MB/s-93.1MB/s), io=178MiB (186MB), run=2001-2001msec 00:08:51.981 WRITE: bw=88.3MiB/s (92.6MB/s), 88.3MiB/s-88.3MiB/s (92.6MB/s-92.6MB/s), io=177MiB (185MB), run=2001-2001msec 00:08:51.981 ----------------------------------------------------- 00:08:51.981 Suppressions used: 00:08:51.981 count bytes template 00:08:51.981 1 32 /usr/src/fio/parse.c 00:08:51.981 1 8 libtcmalloc_minimal.so 00:08:51.981 ----------------------------------------------------- 00:08:51.981 00:08:51.981 04:59:11 nvme.nvme_fio -- 
nvme/nvme.sh@44 -- # ran_fio=true 00:08:51.981 04:59:11 nvme.nvme_fio -- nvme/nvme.sh@34 -- # for bdf in "${bdfs[@]}" 00:08:51.981 04:59:11 nvme.nvme_fio -- nvme/nvme.sh@35 -- # grep -qE '^Namespace ID:[0-9]+' 00:08:51.981 04:59:11 nvme.nvme_fio -- nvme/nvme.sh@35 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:12.0' 00:08:52.243 04:59:12 nvme.nvme_fio -- nvme/nvme.sh@38 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:12.0' 00:08:52.243 04:59:12 nvme.nvme_fio -- nvme/nvme.sh@38 -- # grep -q 'Extended Data LBA' 00:08:52.243 04:59:12 nvme.nvme_fio -- nvme/nvme.sh@41 -- # bs=4096 00:08:52.243 04:59:12 nvme.nvme_fio -- nvme/nvme.sh@43 -- # fio_nvme /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.12.0' --bs=4096 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1364 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.12.0' --bs=4096 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1343 -- # local sanitizers 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1345 -- # shift 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1347 -- # local asan_lib= 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # grep libasan 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1351 -- # break 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme' 00:08:52.243 04:59:12 nvme.nvme_fio -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.12.0' --bs=4096 00:08:52.504 test: (g=0): rw=randrw, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk, iodepth=128 00:08:52.504 fio-3.35 00:08:52.504 Starting 1 thread 00:09:00.706 00:09:00.706 test: (groupid=0, jobs=1): err= 0: pid=77913: Sun Dec 15 04:59:19 2024 00:09:00.706 read: IOPS=23.2k, BW=90.8MiB/s (95.2MB/s)(182MiB/2001msec) 00:09:00.706 slat (nsec): min=4191, max=82473, avg=4873.91, stdev=1915.97 00:09:00.706 clat (usec): min=317, max=11677, avg=2744.79, stdev=742.67 00:09:00.706 lat (usec): min=322, max=11732, avg=2749.67, stdev=743.88 00:09:00.706 clat percentiles (usec): 00:09:00.706 | 1.00th=[ 2089], 5.00th=[ 2212], 10.00th=[ 2278], 20.00th=[ 2376], 00:09:00.706 | 30.00th=[ 2442], 40.00th=[ 2507], 50.00th=[ 
2573], 60.00th=[ 2638], 00:09:00.706 | 70.00th=[ 2737], 80.00th=[ 2835], 90.00th=[ 3130], 95.00th=[ 4047], 00:09:00.706 | 99.00th=[ 6390], 99.50th=[ 6718], 99.90th=[ 7570], 99.95th=[ 8848], 00:09:00.706 | 99.99th=[11338] 00:09:00.706 bw ( KiB/s): min=89192, max=95784, per=98.41%, avg=91458.67, stdev=3747.29, samples=3 00:09:00.706 iops : min=22298, max=23946, avg=22864.67, stdev=936.82, samples=3 00:09:00.706 write: IOPS=23.1k, BW=90.2MiB/s (94.6MB/s)(180MiB/2001msec); 0 zone resets 00:09:00.706 slat (nsec): min=4311, max=44921, avg=5131.39, stdev=1848.28 00:09:00.706 clat (usec): min=206, max=11515, avg=2762.78, stdev=748.52 00:09:00.706 lat (usec): min=211, max=11535, avg=2767.91, stdev=749.70 00:09:00.706 clat percentiles (usec): 00:09:00.706 | 1.00th=[ 2114], 5.00th=[ 2245], 10.00th=[ 2311], 20.00th=[ 2376], 00:09:00.706 | 30.00th=[ 2442], 40.00th=[ 2507], 50.00th=[ 2573], 60.00th=[ 2671], 00:09:00.706 | 70.00th=[ 2737], 80.00th=[ 2868], 90.00th=[ 3163], 95.00th=[ 4228], 00:09:00.706 | 99.00th=[ 6390], 99.50th=[ 6718], 99.90th=[ 7504], 99.95th=[ 8979], 00:09:00.706 | 99.99th=[11076] 00:09:00.706 bw ( KiB/s): min=88936, max=96824, per=99.15%, avg=91570.67, stdev=4549.53, samples=3 00:09:00.706 iops : min=22234, max=24206, avg=22892.67, stdev=1137.38, samples=3 00:09:00.706 lat (usec) : 250=0.01%, 500=0.02%, 750=0.01%, 1000=0.01% 00:09:00.706 lat (msec) : 2=0.28%, 4=94.47%, 10=5.18%, 20=0.03% 00:09:00.706 cpu : usr=99.35%, sys=0.10%, ctx=3, majf=0, minf=628 00:09:00.706 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.1%, >=64=99.9% 00:09:00.706 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:00.706 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:09:00.706 issued rwts: total=46489,46201,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:00.706 latency : target=0, window=0, percentile=100.00%, depth=128 00:09:00.706 00:09:00.706 Run status group 0 (all jobs): 00:09:00.706 READ: bw=90.8MiB/s (95.2MB/s), 90.8MiB/s-90.8MiB/s (95.2MB/s-95.2MB/s), io=182MiB (190MB), run=2001-2001msec 00:09:00.706 WRITE: bw=90.2MiB/s (94.6MB/s), 90.2MiB/s-90.2MiB/s (94.6MB/s-94.6MB/s), io=180MiB (189MB), run=2001-2001msec 00:09:00.706 ----------------------------------------------------- 00:09:00.706 Suppressions used: 00:09:00.706 count bytes template 00:09:00.706 1 32 /usr/src/fio/parse.c 00:09:00.706 1 8 libtcmalloc_minimal.so 00:09:00.706 ----------------------------------------------------- 00:09:00.706 00:09:00.706 04:59:19 nvme.nvme_fio -- nvme/nvme.sh@44 -- # ran_fio=true 00:09:00.706 04:59:19 nvme.nvme_fio -- nvme/nvme.sh@34 -- # for bdf in "${bdfs[@]}" 00:09:00.706 04:59:19 nvme.nvme_fio -- nvme/nvme.sh@35 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:13.0' 00:09:00.706 04:59:19 nvme.nvme_fio -- nvme/nvme.sh@35 -- # grep -qE '^Namespace ID:[0-9]+' 00:09:00.706 04:59:19 nvme.nvme_fio -- nvme/nvme.sh@38 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_nvme_identify -r 'trtype:PCIe traddr:0000:00:13.0' 00:09:00.706 04:59:19 nvme.nvme_fio -- nvme/nvme.sh@38 -- # grep -q 'Extended Data LBA' 00:09:00.706 04:59:20 nvme.nvme_fio -- nvme/nvme.sh@41 -- # bs=4096 00:09:00.706 04:59:20 nvme.nvme_fio -- nvme/nvme.sh@43 -- # fio_nvme /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.13.0' --bs=4096 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1364 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme 
/home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.13.0' --bs=4096 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1343 -- # local sanitizers 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1345 -- # shift 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1347 -- # local asan_lib= 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # grep libasan 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1351 -- # break 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_nvme' 00:09:00.706 04:59:20 nvme.nvme_fio -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio /home/vagrant/spdk_repo/spdk/app/fio/nvme/example_config.fio '--filename=trtype=PCIe traddr=0000.00.13.0' --bs=4096 00:09:00.706 test: (g=0): rw=randrw, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk, iodepth=128 00:09:00.706 fio-3.35 00:09:00.706 Starting 1 thread 00:09:05.996 00:09:05.996 test: (groupid=0, jobs=1): err= 0: pid=77974: Sun Dec 15 04:59:25 2024 00:09:05.996 read: IOPS=23.7k, BW=92.7MiB/s (97.2MB/s)(185MiB/2001msec) 00:09:05.996 slat (nsec): min=4198, max=71120, avg=4828.32, stdev=1799.87 00:09:05.996 clat (usec): min=260, max=10015, avg=2690.13, stdev=700.49 00:09:05.997 lat (usec): min=265, max=10056, avg=2694.95, stdev=701.58 00:09:05.997 clat percentiles (usec): 00:09:05.997 | 1.00th=[ 2057], 5.00th=[ 2180], 10.00th=[ 2245], 20.00th=[ 2343], 00:09:05.997 | 30.00th=[ 2409], 40.00th=[ 2474], 50.00th=[ 2507], 60.00th=[ 2573], 00:09:05.997 | 70.00th=[ 2671], 80.00th=[ 2802], 90.00th=[ 3097], 95.00th=[ 3982], 00:09:05.997 | 99.00th=[ 5997], 99.50th=[ 6325], 99.90th=[ 7635], 99.95th=[ 8291], 00:09:05.997 | 99.99th=[ 9896] 00:09:05.997 bw ( KiB/s): min=91248, max=99512, per=100.00%, avg=95442.67, stdev=4133.43, samples=3 00:09:05.997 iops : min=22812, max=24876, avg=23860.00, stdev=1032.37, samples=3 00:09:05.997 write: IOPS=23.6k, BW=92.1MiB/s (96.5MB/s)(184MiB/2001msec); 0 zone resets 00:09:05.997 slat (usec): min=4, max=125, avg= 5.11, stdev= 1.87 00:09:05.997 clat (usec): min=275, max=9951, avg=2706.01, stdev=718.33 00:09:05.997 lat (usec): min=280, max=9965, avg=2711.13, stdev=719.44 00:09:05.997 clat percentiles (usec): 00:09:05.997 | 1.00th=[ 2073], 5.00th=[ 2180], 10.00th=[ 2245], 20.00th=[ 2343], 00:09:05.997 | 30.00th=[ 2409], 40.00th=[ 2474], 50.00th=[ 2540], 60.00th=[ 2606], 00:09:05.997 | 70.00th=[ 2671], 80.00th=[ 2802], 90.00th=[ 3097], 95.00th=[ 4080], 00:09:05.997 | 
99.00th=[ 6063], 99.50th=[ 6456], 99.90th=[ 7701], 99.95th=[ 8225], 00:09:05.997 | 99.99th=[ 9765] 00:09:05.997 bw ( KiB/s): min=90896, max=98984, per=100.00%, avg=95426.67, stdev=4130.92, samples=3 00:09:05.997 iops : min=22724, max=24746, avg=23856.67, stdev=1032.73, samples=3 00:09:05.997 lat (usec) : 500=0.02%, 750=0.01%, 1000=0.02% 00:09:05.997 lat (msec) : 2=0.40%, 4=94.49%, 10=5.07%, 20=0.01% 00:09:05.997 cpu : usr=99.35%, sys=0.05%, ctx=5, majf=0, minf=626 00:09:05.997 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.1%, >=64=99.9% 00:09:05.997 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:05.997 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:09:05.997 issued rwts: total=47463,47166,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:05.997 latency : target=0, window=0, percentile=100.00%, depth=128 00:09:05.997 00:09:05.997 Run status group 0 (all jobs): 00:09:05.997 READ: bw=92.7MiB/s (97.2MB/s), 92.7MiB/s-92.7MiB/s (97.2MB/s-97.2MB/s), io=185MiB (194MB), run=2001-2001msec 00:09:05.997 WRITE: bw=92.1MiB/s (96.5MB/s), 92.1MiB/s-92.1MiB/s (96.5MB/s-96.5MB/s), io=184MiB (193MB), run=2001-2001msec 00:09:05.997 ----------------------------------------------------- 00:09:05.997 Suppressions used: 00:09:05.997 count bytes template 00:09:05.997 1 32 /usr/src/fio/parse.c 00:09:05.997 1 8 libtcmalloc_minimal.so 00:09:05.997 ----------------------------------------------------- 00:09:05.997 00:09:05.997 04:59:25 nvme.nvme_fio -- nvme/nvme.sh@44 -- # ran_fio=true 00:09:05.997 04:59:25 nvme.nvme_fio -- nvme/nvme.sh@46 -- # true 00:09:05.997 00:09:05.997 real 0m27.624s 00:09:05.997 user 0m20.340s 00:09:05.997 sys 0m11.403s 00:09:05.997 04:59:25 nvme.nvme_fio -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:05.997 04:59:25 nvme.nvme_fio -- common/autotest_common.sh@10 -- # set +x 00:09:05.997 ************************************ 00:09:05.997 END TEST nvme_fio 00:09:05.997 ************************************ 00:09:05.997 00:09:05.997 real 1m34.828s 00:09:05.997 user 3m34.812s 00:09:05.997 sys 0m21.636s 00:09:05.997 04:59:25 nvme -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:05.997 04:59:25 nvme -- common/autotest_common.sh@10 -- # set +x 00:09:05.997 ************************************ 00:09:05.997 END TEST nvme 00:09:05.997 ************************************ 00:09:05.997 04:59:25 -- spdk/autotest.sh@213 -- # [[ 0 -eq 1 ]] 00:09:05.997 04:59:25 -- spdk/autotest.sh@217 -- # run_test nvme_scc /home/vagrant/spdk_repo/spdk/test/nvme/nvme_scc.sh 00:09:05.997 04:59:25 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:09:05.997 04:59:25 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:09:05.997 04:59:25 -- common/autotest_common.sh@10 -- # set +x 00:09:05.997 ************************************ 00:09:05.997 START TEST nvme_scc 00:09:05.997 ************************************ 00:09:05.997 04:59:25 nvme_scc -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/nvme_scc.sh 00:09:05.997 * Looking for test storage... 
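Across the four fio runs above, the BW column is simply the IOPS column times the 4 KiB block size selected earlier (bs=4096, chosen because the "Extended Data LBA" probe came back empty). A rough cross-check for the last run, assuming the rounded 23.7k read IOPS from its table:

awk 'BEGIN {
    iops = 23700; bs = 4096                       # rounded IOPS from the table
    printf "%.1f MB/s (%.1f MiB/s)\n", iops*bs/1e6, iops*bs/2^20
}'   # prints ~97.1 MB/s (~92.6 MiB/s); the log shows 97.2/92.7 from unrounded IOPS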
00:09:05.997 * Found test storage at /home/vagrant/spdk_repo/spdk/test/nvme 00:09:05.997 04:59:25 nvme_scc -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:09:05.997 04:59:25 nvme_scc -- common/autotest_common.sh@1711 -- # lcov --version 00:09:05.997 04:59:25 nvme_scc -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:09:05.997 04:59:25 nvme_scc -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@336 -- # IFS=.-: 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@336 -- # read -ra ver1 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@337 -- # IFS=.-: 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@337 -- # read -ra ver2 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@338 -- # local 'op=<' 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@340 -- # ver1_l=2 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@341 -- # ver2_l=1 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@344 -- # case "$op" in 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@345 -- # : 1 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@365 -- # decimal 1 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@353 -- # local d=1 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@355 -- # echo 1 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@365 -- # ver1[v]=1 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@366 -- # decimal 2 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@353 -- # local d=2 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@355 -- # echo 2 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@366 -- # ver2[v]=2 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@368 -- # return 0 00:09:05.997 04:59:25 nvme_scc -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:05.997 04:59:25 nvme_scc -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:09:05.997 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:05.997 --rc genhtml_branch_coverage=1 00:09:05.997 --rc genhtml_function_coverage=1 00:09:05.997 --rc genhtml_legend=1 00:09:05.997 --rc geninfo_all_blocks=1 00:09:05.997 --rc geninfo_unexecuted_blocks=1 00:09:05.997 00:09:05.997 ' 00:09:05.997 04:59:25 nvme_scc -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:09:05.997 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:05.997 --rc genhtml_branch_coverage=1 00:09:05.997 --rc genhtml_function_coverage=1 00:09:05.997 --rc genhtml_legend=1 00:09:05.997 --rc geninfo_all_blocks=1 00:09:05.997 --rc geninfo_unexecuted_blocks=1 00:09:05.997 00:09:05.997 ' 00:09:05.997 04:59:25 nvme_scc -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 
00:09:05.997 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:05.997 --rc genhtml_branch_coverage=1 00:09:05.997 --rc genhtml_function_coverage=1 00:09:05.997 --rc genhtml_legend=1 00:09:05.997 --rc geninfo_all_blocks=1 00:09:05.997 --rc geninfo_unexecuted_blocks=1 00:09:05.997 00:09:05.997 ' 00:09:05.997 04:59:25 nvme_scc -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:09:05.997 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:05.997 --rc genhtml_branch_coverage=1 00:09:05.997 --rc genhtml_function_coverage=1 00:09:05.997 --rc genhtml_legend=1 00:09:05.997 --rc geninfo_all_blocks=1 00:09:05.997 --rc geninfo_unexecuted_blocks=1 00:09:05.997 00:09:05.997 ' 00:09:05.997 04:59:25 nvme_scc -- cuse/common.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/common/nvme/functions.sh 00:09:05.997 04:59:25 nvme_scc -- nvme/functions.sh@7 -- # dirname /home/vagrant/spdk_repo/spdk/test/common/nvme/functions.sh 00:09:05.997 04:59:25 nvme_scc -- nvme/functions.sh@7 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/common/nvme/../../../ 00:09:05.997 04:59:25 nvme_scc -- nvme/functions.sh@7 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:09:05.997 04:59:25 nvme_scc -- nvme/functions.sh@8 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@15 -- # shopt -s extglob 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:05.997 04:59:25 nvme_scc -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:05.997 04:59:25 nvme_scc -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:05.997 04:59:25 nvme_scc -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:05.997 04:59:25 nvme_scc -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:05.997 04:59:25 nvme_scc -- paths/export.sh@5 -- # export PATH 00:09:05.997 04:59:25 nvme_scc -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 
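The PATH echoed above carries the golangci/protoc/go prefixes four times over because paths/export.sh prepends them unconditionally each time it is sourced (once per nested test script). A guarded prepend would keep the variable idempotent; a hypothetical helper, not something paths/export.sh actually provides:

path_prepend() {                   # hypothetical helper; not in the repo
    case ":$PATH:" in
        *":$1:"*) ;;               # already present, do nothing
        *) PATH="$1:$PATH" ;;
    esac
}
path_prepend /opt/golangci/1.54.2/bin
path_prepend /opt/protoc/21.7/bin
path_prepend /opt/go/1.21.1/bin
export PATH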
00:09:05.998 04:59:25 nvme_scc -- nvme/functions.sh@10 -- # ctrls=() 00:09:05.998 04:59:25 nvme_scc -- nvme/functions.sh@10 -- # declare -A ctrls 00:09:05.998 04:59:25 nvme_scc -- nvme/functions.sh@11 -- # nvmes=() 00:09:05.998 04:59:25 nvme_scc -- nvme/functions.sh@11 -- # declare -A nvmes 00:09:05.998 04:59:25 nvme_scc -- nvme/functions.sh@12 -- # bdfs=() 00:09:05.998 04:59:25 nvme_scc -- nvme/functions.sh@12 -- # declare -A bdfs 00:09:05.998 04:59:25 nvme_scc -- nvme/functions.sh@13 -- # ordered_ctrls=() 00:09:05.998 04:59:25 nvme_scc -- nvme/functions.sh@13 -- # declare -a ordered_ctrls 00:09:05.998 04:59:25 nvme_scc -- nvme/functions.sh@14 -- # nvme_name= 00:09:05.998 04:59:25 nvme_scc -- cuse/common.sh@11 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:09:05.998 04:59:25 nvme_scc -- nvme/nvme_scc.sh@12 -- # uname 00:09:05.998 04:59:26 nvme_scc -- nvme/nvme_scc.sh@12 -- # [[ Linux == Linux ]] 00:09:05.998 04:59:26 nvme_scc -- nvme/nvme_scc.sh@12 -- # [[ QEMU == QEMU ]] 00:09:05.998 04:59:26 nvme_scc -- nvme/nvme_scc.sh@14 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:09:06.259 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:09:06.520 Waiting for block devices as requested 00:09:06.520 0000:00:11.0 (1b36 0010): uio_pci_generic -> nvme 00:09:06.520 0000:00:10.0 (1b36 0010): uio_pci_generic -> nvme 00:09:06.520 0000:00:12.0 (1b36 0010): uio_pci_generic -> nvme 00:09:06.520 0000:00:13.0 (1b36 0010): uio_pci_generic -> nvme 00:09:11.843 * Events for some block/disk devices (0000:00:13.0) were not caught, they may be missing 00:09:11.843 04:59:31 nvme_scc -- nvme/nvme_scc.sh@16 -- # scan_nvme_ctrls 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@45 -- # local ctrl ctrl_dev reg val ns pci 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@47 -- # for ctrl in /sys/class/nvme/nvme* 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@48 -- # [[ -e /sys/class/nvme/nvme0 ]] 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@49 -- # pci=0000:00:11.0 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@50 -- # pci_can_use 0000:00:11.0 00:09:11.843 04:59:31 nvme_scc -- scripts/common.sh@18 -- # local i 00:09:11.843 04:59:31 nvme_scc -- scripts/common.sh@21 -- # [[ =~ 0000:00:11.0 ]] 00:09:11.843 04:59:31 nvme_scc -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:11.843 04:59:31 nvme_scc -- scripts/common.sh@27 -- # return 0 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@51 -- # ctrl_dev=nvme0 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@52 -- # nvme_get nvme0 id-ctrl /dev/nvme0 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=nvme0 reg val 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'nvme0=()' 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ctrl /dev/nvme0 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1b36 ]] 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[vid]="0x1b36"' 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[vid]=0x1b36 
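The nvme_get trace that follows is one long unrolled loop: functions.sh runs the nvme-cli id-ctrl command shown above and reads each "field : value" output line into the nvme0 associative array declared earlier, which is where the eval 'nvme0[vid]="0x1b36"' steps come from. Condensed to its core, using the same IFS=: split as the trace (the real helper also preserves value padding and handles nested namespace data):

declare -A nvme0
while IFS=: read -r reg val; do
    reg=${reg//[[:space:]]/}           # keys arrive padded, e.g. "vid       "
    [[ -n $reg && -n $val ]] || continue
    nvme0[$reg]=${val# }               # e.g. nvme0[vid]=0x1b36
done < <(/usr/local/src/nvme-cli/nvme id-ctrl /dev/nvme0)
echo "${nvme0[sn]}"                    # "12341" for this QEMU controller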
00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1af4 ]] 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[ssvid]="0x1af4"' 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[ssvid]=0x1af4 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 12341 ]] 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[sn]="12341 "' 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[sn]='12341 ' 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n QEMU NVMe Ctrl ]] 00:09:11.843 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[mn]="QEMU NVMe Ctrl "' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[mn]='QEMU NVMe Ctrl ' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 8.0.0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[fr]="8.0.0 "' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[fr]='8.0.0 ' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 6 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[rab]="6"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[rab]=6 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 525400 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[ieee]="525400"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[ieee]=525400 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[cmic]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[cmic]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[mdts]="7"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[mdts]=7 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[cntlid]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[cntlid]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # 
read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x10400 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[ver]="0x10400"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[ver]=0x10400 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[rtd3r]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[rtd3r]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[rtd3e]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[rtd3e]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[oaes]="0x100"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[oaes]=0x100 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x8000 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[ctratt]="0x8000"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[ctratt]=0x8000 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[rrls]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[rrls]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[cntrltype]="1"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[cntrltype]=1 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000-0000-0000-0000-000000000000 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[fguid]="00000000-0000-0000-0000-000000000000"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[fguid]=00000000-0000-0000-0000-000000000000 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[crdt1]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[crdt1]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- 
nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[crdt2]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[crdt2]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[crdt3]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[crdt3]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[nvmsr]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[nvmsr]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[vwci]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[vwci]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[mec]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[mec]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x12a ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[oacs]="0x12a"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[oacs]=0x12a 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[acl]="3"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[acl]=3 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[aerl]="3"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[aerl]=3 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[frmw]="0x3"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[frmw]=0x3 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[lpa]="0x7"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[lpa]=0x7 
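Nearly everything captured here is stored verbatim as a decimal or hex string, and the bitmask registers (oacs=0x12a, frmw=0x3 and lpa=0x7 above; oncs, sqes and cqes appear further down the dump) decode with plain shell arithmetic afterwards. A hedged sketch; has_bit and decode_qes are illustrative helpers of this note, not functions from functions.sh, with bit meanings taken from the NVMe base specification:

has_bit() { (( ($1 >> $2) & 1 )); }   # test one bit of a captured mask

oacs=0x12a                            # value recorded above
has_bit "$oacs" 1 && echo "Format NVM supported"            # OACS bit 1
has_bit "$oacs" 3 && echo "Namespace Management supported"  # OACS bit 3

# sqes/cqes (0x66 and 0x44 later in this dump) pack two sizes per byte:
# low nibble = required entry size, high nibble = maximum, each 2^n bytes.
decode_qes() { echo "min=$((1 << ($1 & 0xf))) max=$((1 << ($1 >> 4)))"; }
decode_qes 0x66   # SQ entries: min=64 max=64 (bytes)
decode_qes 0x44   # CQ entries: min=16 max=16 (bytes)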
00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[elpe]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[elpe]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[npss]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[npss]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[avscc]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[avscc]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[apsta]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[apsta]=0 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 343 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[wctemp]="343"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[wctemp]=343 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 373 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[cctemp]="373"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[cctemp]=373 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[mtfa]="0"' 00:09:11.844 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[mtfa]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[hmpre]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[hmpre]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[hmmin]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[hmmin]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 
-- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[tnvmcap]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[tnvmcap]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[unvmcap]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[unvmcap]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[rpmbs]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[rpmbs]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[edstt]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[edstt]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[dsto]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[dsto]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[fwug]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[fwug]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[kas]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[kas]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[hctma]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[hctma]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[mntmt]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[mntmt]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[mxtmt]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[mxtmt]=0 00:09:11.845 04:59:31 nvme_scc 
-- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[sanicap]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[sanicap]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[hmminds]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[hmminds]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[hmmaxd]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[hmmaxd]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[nsetidmax]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[nsetidmax]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[endgidmax]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[endgidmax]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[anatt]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[anatt]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[anacap]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[anacap]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[anagrpmax]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[anagrpmax]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[nanagrpid]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[nanagrpid]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- 
nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[pels]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[pels]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[domainid]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[domainid]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[megcap]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[megcap]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x66 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[sqes]="0x66"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[sqes]=0x66 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x44 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[cqes]="0x44"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[cqes]=0x44 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[maxcmd]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[maxcmd]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 256 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[nn]="256"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[nn]=256 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x15d ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[oncs]="0x15d"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[oncs]=0x15d 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[fuses]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[fuses]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[fna]="0"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 
-- # nvme0[fna]=0 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[vwc]="0x7"' 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[vwc]=0x7 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.845 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[awun]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[awun]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[awupf]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[awupf]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[icsvscc]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[icsvscc]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[nwpc]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[nwpc]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[acwu]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[acwu]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[ocfs]="0x3"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[ocfs]=0x3 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[sgls]="0x1"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[sgls]=0x1 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[mnan]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[mnan]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- 
nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[maxdna]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[maxdna]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[maxcna]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[maxcna]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n nqn.2019-08.org.qemu:12341 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[subnqn]="nqn.2019-08.org.qemu:12341"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[subnqn]=nqn.2019-08.org.qemu:12341 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[ioccsz]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[ioccsz]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[iorcsz]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[iorcsz]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[icdoff]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[icdoff]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[fcatt]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[fcatt]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[msdbd]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[msdbd]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[ofcs]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[ofcs]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0 ]] 00:09:11.846 04:59:31 nvme_scc -- 
nvme/functions.sh@23 -- # eval 'nvme0[ps0]="mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[ps0]='mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 rwl:0 idle_power:- active_power:- ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[rwt]="0 rwl:0 idle_power:- active_power:-"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[rwt]='0 rwl:0 idle_power:- active_power:-' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n - ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0[active_power_workload]="-"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0[active_power_workload]=- 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@53 -- # local -n _ctrl_ns=nvme0_ns 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme0/ng0n1 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@56 -- # ns_dev=ng0n1 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@57 -- # nvme_get ng0n1 id-ns /dev/ng0n1 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=ng0n1 reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'ng0n1=()' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/ng0n1 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nsze]="0x140000"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nsze]=0x140000 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[ncap]="0x140000"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[ncap]=0x140000 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nuse]="0x140000"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nuse]=0x140000 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # 
read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nsfeat]="0x14"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nsfeat]=0x14 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nlbaf]="7"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nlbaf]=7 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[flbas]="0x4"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[flbas]=0x4 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[mc]="0x3"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[mc]=0x3 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[dpc]="0x1f"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[dpc]=0x1f 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[dps]="0"' 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[dps]=0 00:09:11.846 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nmic]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nmic]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[rescap]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[rescap]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[fpi]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[fpi]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[dlfeat]="1"' 00:09:11.847 
00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[dlfeat]=1 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nawun]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nawun]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nawupf]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nawupf]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nacwu]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nacwu]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nabsn]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nabsn]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nabo]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nabo]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nabspf]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nabspf]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[noiob]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[noiob]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nvmcap]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nvmcap]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[npwg]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[npwg]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 
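With the controller record complete, the loop at functions.sh@54-57 (seen above entering ng0n1, and again for nvme0n1 a few entries below) walks the controller's namespaces using an extglob pattern that matches both the character node and the block node under the controller's sysfs directory. A standalone sketch of that glob, assuming extglob is available as in the harness shell:

shopt -s extglob nullglob
ctrl=/sys/class/nvme/nvme0
# ${ctrl##*nvme} -> "0" and ${ctrl##*/} -> "nvme0", so the pattern below
# expands to @(ng0|nvme0n)* and matches ng0n1 as well as nvme0n1; nullglob
# keeps the loop silent on machines without namespace nodes.
for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")*; do
    echo "namespace node: ${ns##*/}"
done

Both nodes describe the same namespace: nsze/ncap/nuse of 0x140000 blocks at the 4096-byte format flagged "(in use)" in the lbaf tables (flbas=0x4, lbads:12, i.e. 2^12-byte blocks), which works out to 0x140000 * 4096 bytes = 5 GiB.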
00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[npwa]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[npwa]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[npdg]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[npdg]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[npda]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[npda]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nows]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nows]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[mssrl]="128"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[mssrl]=128 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[mcl]="128"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[mcl]=128 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[msrc]="127"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[msrc]=127 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nulbaf]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nulbaf]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[anagrpid]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[anagrpid]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nsattr]="0"' 00:09:11.847 04:59:31 nvme_scc -- 
nvme/functions.sh@23 -- # ng0n1[nsattr]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nvmsetid]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nvmsetid]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[endgid]="0"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[endgid]=0 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[nguid]="00000000000000000000000000000000"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[nguid]=00000000000000000000000000000000 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[eui64]="0000000000000000"' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[eui64]=0000000000000000 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.847 04:59:31 nvme_scc -- 
nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:11.847 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng0n1[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=ng0n1 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme0/nvme0n1 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@56 -- # ns_dev=nvme0n1 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@57 -- # nvme_get nvme0n1 id-ns /dev/nvme0n1 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=nvme0n1 reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'nvme0n1=()' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/nvme0n1 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nsze]="0x140000"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nsze]=0x140000 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[ncap]="0x140000"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[ncap]=0x140000 00:09:11.848 04:59:31 nvme_scc 
-- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nuse]="0x140000"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nuse]=0x140000 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nsfeat]="0x14"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nsfeat]=0x14 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nlbaf]="7"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nlbaf]=7 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[flbas]="0x4"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[flbas]=0x4 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[mc]="0x3"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[mc]=0x3 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[dpc]="0x1f"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[dpc]=0x1f 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[dps]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[dps]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nmic]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nmic]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[rescap]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[rescap]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 
nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[fpi]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[fpi]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[dlfeat]="1"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[dlfeat]=1 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nawun]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nawun]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nawupf]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nawupf]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nacwu]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nacwu]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nabsn]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nabsn]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nabo]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nabo]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nabspf]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nabspf]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[noiob]="0"' 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[noiob]=0 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.848 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nvmcap]="0"' 00:09:11.849 04:59:31 nvme_scc -- 
nvme/functions.sh@23 -- # nvme0n1[nvmcap]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[npwg]="0"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[npwg]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[npwa]="0"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[npwa]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[npdg]="0"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[npdg]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[npda]="0"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[npda]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nows]="0"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nows]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[mssrl]="128"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[mssrl]=128 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[mcl]="128"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[mcl]=128 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[msrc]="127"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[msrc]=127 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nulbaf]="0"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nulbaf]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 
-- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[anagrpid]="0"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[anagrpid]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nsattr]="0"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nsattr]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nvmsetid]="0"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nvmsetid]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[endgid]="0"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[endgid]=0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[nguid]="00000000000000000000000000000000"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[nguid]=00000000000000000000000000000000 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[eui64]="0000000000000000"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[eui64]=0000000000000000 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # 
nvme0n1[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme0n1[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme0n1 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@60 -- # ctrls["$ctrl_dev"]=nvme0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@61 -- # nvmes["$ctrl_dev"]=nvme0_ns 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@62 -- # bdfs["$ctrl_dev"]=0000:00:11.0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@63 -- # ordered_ctrls[${ctrl_dev/nvme/}]=nvme0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@47 -- # for ctrl in /sys/class/nvme/nvme* 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@48 -- # [[ -e /sys/class/nvme/nvme1 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@49 -- # pci=0000:00:10.0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@50 -- # pci_can_use 0000:00:10.0 00:09:11.849 04:59:31 nvme_scc -- scripts/common.sh@18 -- # local i 00:09:11.849 04:59:31 nvme_scc -- scripts/common.sh@21 -- # [[ =~ 0000:00:10.0 ]] 00:09:11.849 04:59:31 nvme_scc -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:11.849 04:59:31 nvme_scc -- scripts/common.sh@27 -- # return 0 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@51 -- # ctrl_dev=nvme1 00:09:11.849 04:59:31 
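[annotation] The xtrace above has just finished populating nvme0n1's id-ns fields and registered the controller in the harness's associative arrays (ctrls, nvmes, bdfs, ordered_ctrls) before the /sys/class/nvme/nvme* loop advances to nvme1. The parsing idiom it keeps repeating — run nvme-cli, split each "key : value" line on IFS=:, eval the pair into a globally scoped associative array — can be reproduced standalone. A minimal sketch, assuming canned input in place of a real /dev/nvme1; demo_get is an illustrative name, not the harness's actual nvme_get:

    #!/usr/bin/env bash
    # Sketch of the "IFS=: read -r reg val" + eval idiom from the trace.
    demo_get() {
        local ref=$1 reg val        # $1 = name of the target assoc array
        shift
        local -gA "$ref=()"         # declare at global scope, as @20 does
        while IFS=: read -r reg val; do
            [[ -n $reg ]] || continue              # trace skips empty keys
            reg=${reg//[[:space:]]/}               # drop padding around key
            val=${val#"${val%%[![:space:]]*}"}     # drop leading blanks only
            eval "${ref}[${reg}]=\"${val}\""       # cf. trace line @23
        done < <("$@")
    }

    # Canned lines standing in for `nvme id-ctrl /dev/nvme1`:
    demo_get nvme1 printf 'vid       : 0x1b36\nssvid     : 0x1af4\nmdts      : 7\n'
    echo "${nvme1[vid]} ${nvme1[mdts]}"            # -> 0x1b36 7

One subtlety the trace makes visible: trailing blanks inside values are preserved (sn and mn are stored as '12340 ' and 'QEMU NVMe Ctrl '), which is why the sketch strips only leading whitespace from val.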
nvme_scc -- nvme/functions.sh@52 -- # nvme_get nvme1 id-ctrl /dev/nvme1 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=nvme1 reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'nvme1=()' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ctrl /dev/nvme1 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1b36 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[vid]="0x1b36"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[vid]=0x1b36 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1af4 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[ssvid]="0x1af4"' 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[ssvid]=0x1af4 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 12340 ]] 00:09:11.849 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[sn]="12340 "' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[sn]='12340 ' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n QEMU NVMe Ctrl ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[mn]="QEMU NVMe Ctrl "' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[mn]='QEMU NVMe Ctrl ' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 8.0.0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[fr]="8.0.0 "' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[fr]='8.0.0 ' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 6 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[rab]="6"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[rab]=6 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 525400 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[ieee]="525400"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[ieee]=525400 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 
04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[cmic]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[cmic]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[mdts]="7"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[mdts]=7 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[cntlid]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[cntlid]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x10400 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[ver]="0x10400"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[ver]=0x10400 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[rtd3r]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[rtd3r]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[rtd3e]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[rtd3e]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[oaes]="0x100"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[oaes]=0x100 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x8000 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[ctratt]="0x8000"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[ctratt]=0x8000 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[rrls]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[rrls]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[cntrltype]="1"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[cntrltype]=1 00:09:11.850 
04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000-0000-0000-0000-000000000000 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[fguid]="00000000-0000-0000-0000-000000000000"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[fguid]=00000000-0000-0000-0000-000000000000 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[crdt1]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[crdt1]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[crdt2]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[crdt2]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[crdt3]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[crdt3]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[nvmsr]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[nvmsr]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[vwci]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[vwci]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[mec]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[mec]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x12a ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[oacs]="0x12a"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[oacs]=0x12a 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[acl]="3"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[acl]=3 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[aerl]="3"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[aerl]=3 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[frmw]="0x3"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[frmw]=0x3 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[lpa]="0x7"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[lpa]=0x7 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[elpe]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[elpe]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[npss]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[npss]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[avscc]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[avscc]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[apsta]="0"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[apsta]=0 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 343 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[wctemp]="343"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[wctemp]=343 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 373 ]] 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[cctemp]="373"' 00:09:11.850 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[cctemp]=373 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 
'nvme1[mtfa]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[mtfa]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[hmpre]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[hmpre]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[hmmin]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[hmmin]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[tnvmcap]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[tnvmcap]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[unvmcap]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[unvmcap]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[rpmbs]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[rpmbs]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[edstt]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[edstt]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[dsto]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[dsto]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[fwug]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[fwug]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[kas]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[kas]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[hctma]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[hctma]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[mntmt]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[mntmt]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[mxtmt]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[mxtmt]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[sanicap]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[sanicap]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[hmminds]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[hmminds]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[hmmaxd]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[hmmaxd]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[nsetidmax]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[nsetidmax]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[endgidmax]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[endgidmax]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[anatt]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[anatt]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 
'nvme1[anacap]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[anacap]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[anagrpmax]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[anagrpmax]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[nanagrpid]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[nanagrpid]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[pels]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[pels]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[domainid]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[domainid]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[megcap]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[megcap]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x66 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[sqes]="0x66"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[sqes]=0x66 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x44 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[cqes]="0x44"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[cqes]=0x44 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[maxcmd]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[maxcmd]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 256 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[nn]="256"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[nn]=256 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 
00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x15d ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[oncs]="0x15d"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[oncs]=0x15d 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[fuses]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[fuses]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[fna]="0"' 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[fna]=0 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:11.851 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[vwc]="0x7"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[vwc]=0x7 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[awun]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[awun]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[awupf]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[awupf]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[icsvscc]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[icsvscc]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[nwpc]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[nwpc]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[acwu]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[acwu]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.852 04:59:31 nvme_scc -- 
nvme/functions.sh@23 -- # eval 'nvme1[ocfs]="0x3"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[ocfs]=0x3 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[sgls]="0x1"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[sgls]=0x1 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[mnan]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[mnan]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[maxdna]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[maxdna]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[maxcna]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[maxcna]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n nqn.2019-08.org.qemu:12340 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[subnqn]="nqn.2019-08.org.qemu:12340"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[subnqn]=nqn.2019-08.org.qemu:12340 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[ioccsz]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[ioccsz]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[iorcsz]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[iorcsz]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[icdoff]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[icdoff]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[fcatt]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # 
nvme1[fcatt]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[msdbd]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[msdbd]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[ofcs]="0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[ofcs]=0 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[ps0]="mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[ps0]='mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 rwl:0 idle_power:- active_power:- ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[rwt]="0 rwl:0 idle_power:- active_power:-"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[rwt]='0 rwl:0 idle_power:- active_power:-' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n - ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1[active_power_workload]="-"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1[active_power_workload]=- 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@53 -- # local -n _ctrl_ns=nvme1_ns 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme1/ng1n1 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@56 -- # ns_dev=ng1n1 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@57 -- # nvme_get ng1n1 id-ns /dev/ng1n1 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=ng1n1 reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'ng1n1=()' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/ng1n1 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x17a17a ]] 00:09:11.852 04:59:31 
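[annotation] With nvme1's id-ctrl fields captured through ps0/rwt/active_power_workload, the trace switches to namespace discovery: functions.sh@53 binds a nameref to the per-controller array and @54 globs both the ngXnY generic node and the nvmeXnY block node. A minimal sketch of that pattern under the same sysfs layout; collect_namespaces is an illustrative wrapper name, not harness code:

    #!/usr/bin/env bash
    # Namespace enumeration as seen at functions.sh@53-58.
    shopt -s extglob nullglob

    declare -A nvme1_ns=()
    collect_namespaces() {
        local ctrl=$1                        # e.g. /sys/class/nvme/nvme1
        local -n _ctrl_ns=${ctrl##*/}_ns     # nameref -> nvme1_ns
        local ns ns_dev
        for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")*; do
            [[ -e $ns ]] || continue         # cf. the check at @55
            ns_dev=${ns##*/}                 # ng1n1, nvme1n1, ...
            # Key is the namespace id after the last 'n'; ng1n1 and
            # nvme1n1 share key 1, so whichever node is visited last
            # overwrites the other in the array.
            _ctrl_ns[${ns_dev##*n}]=$ns_dev
        done
    }

    collect_namespaces /sys/class/nvme/nvme1
    declare -p nvme1_ns                      # e.g. declare -A nvme1_ns=([1]="nvme1n1")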
nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nsze]="0x17a17a"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nsze]=0x17a17a 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x17a17a ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[ncap]="0x17a17a"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[ncap]=0x17a17a 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x17a17a ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nuse]="0x17a17a"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nuse]=0x17a17a 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nsfeat]="0x14"' 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nsfeat]=0x14 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.852 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nlbaf]="7"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nlbaf]=7 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[flbas]="0x7"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[flbas]=0x7 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[mc]="0x3"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[mc]=0x3 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[dpc]="0x1f"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[dpc]=0x1f 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[dps]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[dps]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nmic]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nmic]=0 
00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[rescap]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[rescap]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[fpi]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[fpi]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[dlfeat]="1"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[dlfeat]=1 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nawun]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nawun]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nawupf]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nawupf]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nacwu]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nacwu]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nabsn]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nabsn]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nabo]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nabo]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nabspf]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nabspf]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ 
-n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[noiob]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[noiob]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nvmcap]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nvmcap]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[npwg]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[npwg]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[npwa]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[npwa]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[npdg]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[npdg]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[npda]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[npda]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nows]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nows]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[mssrl]="128"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[mssrl]=128 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[mcl]="128"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[mcl]=128 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[msrc]="127"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[msrc]=127 00:09:11.853 04:59:31 nvme_scc -- 
nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nulbaf]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nulbaf]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[anagrpid]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[anagrpid]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nsattr]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nsattr]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nvmsetid]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nvmsetid]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[endgid]="0"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[endgid]=0 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[nguid]="00000000000000000000000000000000"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[nguid]=00000000000000000000000000000000 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[eui64]="0000000000000000"' 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[eui64]=0000000000000000 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.853 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # 
ng1n1[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[lbaf4]="ms:0 lbads:12 rp:0 "' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[lbaf4]='ms:0 lbads:12 rp:0 ' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 (in use) ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng1n1[lbaf7]="ms:64 lbads:12 rp:0 (in use)"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng1n1[lbaf7]='ms:64 lbads:12 rp:0 (in use)' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=ng1n1 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme1/nvme1n1 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@56 -- # ns_dev=nvme1n1 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@57 -- # nvme_get nvme1n1 id-ns /dev/nvme1n1 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=nvme1n1 reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'nvme1n1=()' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 
04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/nvme1n1 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x17a17a ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nsze]="0x17a17a"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nsze]=0x17a17a 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x17a17a ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[ncap]="0x17a17a"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[ncap]=0x17a17a 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x17a17a ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nuse]="0x17a17a"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nuse]=0x17a17a 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nsfeat]="0x14"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nsfeat]=0x14 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nlbaf]="7"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nlbaf]=7 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[flbas]="0x7"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[flbas]=0x7 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[mc]="0x3"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[mc]=0x3 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[dpc]="0x1f"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[dpc]=0x1f 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 
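
The block above is nvme/functions.sh's nvme_get at work: every "field : value" line that /usr/local/src/nvme-cli/nvme id-ns prints is split on ':' into reg/val and eval'd into a bash associative array (ng1n1, nvme1n1, ...), with an [[ -n ... ]] guard skipping empty reads. A minimal stand-alone sketch of that pattern — a simplified reconstruction, not the full helper, using a hypothetical array name ns:

    #!/usr/bin/env bash
    # Simplified reconstruction of the nvme_get parsing loop traced above;
    # the real helper also walks the ng* char devices and nested refs.
    shopt -s extglob
    declare -A ns=()
    while IFS=: read -r reg val; do
        reg=${reg//[[:space:]]/}           # 'nsze   ' -> 'nsze', 'lbaf  0' -> 'lbaf0'
        val=${val##+([[:space:]])}         # trim leading spaces from the value
        [[ -n $reg && -n $val ]] || continue   # mirrors the [[ -n ... ]] guards above
        eval "ns[$reg]=\"$val\""           # e.g. ns[nsze]="0x17a17a"
    done < <(/usr/local/src/nvme-cli/nvme id-ns /dev/nvme1n1)
    echo "nsze=${ns[nsze]} flbas=${ns[flbas]}"

Note that read with IFS=: and two variables splits only at the first colon, so values that themselves contain colons (the 'ms:8 lbads:9 rp:0' lbaf lines) land intact in val, exactly as they appear in the arrays above.
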
00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[dps]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[dps]=0 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nmic]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nmic]=0 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[rescap]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[rescap]=0 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[fpi]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[fpi]=0 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[dlfeat]="1"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[dlfeat]=1 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nawun]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nawun]=0 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nawupf]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nawupf]=0 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nacwu]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nacwu]=0 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nabsn]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nabsn]=0 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nabo]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nabo]=0 00:09:11.854 
04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nabspf]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nabspf]=0 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[noiob]="0"' 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[noiob]=0 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.854 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nvmcap]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nvmcap]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[npwg]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[npwg]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[npwa]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[npwa]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[npdg]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[npdg]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[npda]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[npda]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nows]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nows]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[mssrl]="128"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[mssrl]=128 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- 
nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[mcl]="128"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[mcl]=128 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[msrc]="127"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[msrc]=127 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nulbaf]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nulbaf]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[anagrpid]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[anagrpid]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nsattr]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nsattr]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nvmsetid]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nvmsetid]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[endgid]="0"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[endgid]=0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[nguid]="00000000000000000000000000000000"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[nguid]=00000000000000000000000000000000 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[eui64]="0000000000000000"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[eui64]=0000000000000000 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- 
nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf4]="ms:0 lbads:12 rp:0 "' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[lbaf4]='ms:0 lbads:12 rp:0 ' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 (in use) ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf7]="ms:64 lbads:12 rp:0 (in use)"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme1n1[lbaf7]='ms:64 lbads:12 rp:0 (in use)' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme1n1 00:09:11.855 04:59:31 
nvme_scc -- nvme/functions.sh@60 -- # ctrls["$ctrl_dev"]=nvme1 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@61 -- # nvmes["$ctrl_dev"]=nvme1_ns 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@62 -- # bdfs["$ctrl_dev"]=0000:00:10.0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@63 -- # ordered_ctrls[${ctrl_dev/nvme/}]=nvme1 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@47 -- # for ctrl in /sys/class/nvme/nvme* 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@48 -- # [[ -e /sys/class/nvme/nvme2 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@49 -- # pci=0000:00:12.0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@50 -- # pci_can_use 0000:00:12.0 00:09:11.855 04:59:31 nvme_scc -- scripts/common.sh@18 -- # local i 00:09:11.855 04:59:31 nvme_scc -- scripts/common.sh@21 -- # [[ =~ 0000:00:12.0 ]] 00:09:11.855 04:59:31 nvme_scc -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:11.855 04:59:31 nvme_scc -- scripts/common.sh@27 -- # return 0 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@51 -- # ctrl_dev=nvme2 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@52 -- # nvme_get nvme2 id-ctrl /dev/nvme2 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=nvme2 reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'nvme2=()' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ctrl /dev/nvme2 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1b36 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[vid]="0x1b36"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[vid]=0x1b36 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1af4 ]] 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[ssvid]="0x1af4"' 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[ssvid]=0x1af4 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.855 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 12342 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[sn]="12342 "' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[sn]='12342 ' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n QEMU NVMe Ctrl ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[mn]="QEMU NVMe Ctrl "' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[mn]='QEMU NVMe Ctrl ' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 8.0.0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 
'nvme2[fr]="8.0.0 "' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[fr]='8.0.0 ' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 6 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[rab]="6"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[rab]=6 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 525400 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[ieee]="525400"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[ieee]=525400 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[cmic]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[cmic]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[mdts]="7"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[mdts]=7 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[cntlid]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[cntlid]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x10400 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[ver]="0x10400"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[ver]=0x10400 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[rtd3r]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[rtd3r]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[rtd3e]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[rtd3e]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[oaes]="0x100"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[oaes]=0x100 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 
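
Among the fields just captured, mdts deserves a note: it is not a byte count but a power-of-two multiplier on the controller's minimum page size (CAP.MPSMIN). Assuming the common 4 KiB minimum page — an assumption, since CAP is not in this dump — the mdts=7 recorded for nvme2 works out as:

    mps_min=4096                           # assumed CAP.MPSMIN page size
    mdts=7                                 # captured above as nvme2[mdts]
    echo "$(( (mps_min << mdts) / 1024 )) KiB max transfer"   # -> 512 KiB
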
00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x8000 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[ctratt]="0x8000"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[ctratt]=0x8000 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[rrls]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[rrls]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[cntrltype]="1"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[cntrltype]=1 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000-0000-0000-0000-000000000000 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[fguid]="00000000-0000-0000-0000-000000000000"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[fguid]=00000000-0000-0000-0000-000000000000 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[crdt1]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[crdt1]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[crdt2]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[crdt2]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[crdt3]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[crdt3]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[nvmsr]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[nvmsr]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[vwci]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[vwci]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 
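
For reference, the controller walk that produced this dump (the for ctrl in /sys/class/nvme/nvme* loop with its pci_can_use gate, visible earlier in the trace) can be approximated stand-alone. Resolving the sysfs device link is one way to recover the PCI address the trace stores in bdfs, assuming a typical Linux sysfs layout for PCIe-attached controllers:

    declare -A bdfs=()
    for ctrl in /sys/class/nvme/nvme*; do
        [[ -e $ctrl/device ]] || continue
        bdf=$(readlink -f "$ctrl/device")  # resolves to .../0000:00:12.0
        bdfs[${ctrl##*/}]=${bdf##*/}       # e.g. bdfs[nvme2]=0000:00:12.0
    done
    declare -p bdfs
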
00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[mec]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[mec]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x12a ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[oacs]="0x12a"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[oacs]=0x12a 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[acl]="3"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[acl]=3 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[aerl]="3"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[aerl]=3 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[frmw]="0x3"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[frmw]=0x3 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[lpa]="0x7"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[lpa]=0x7 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[elpe]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[elpe]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[npss]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[npss]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[avscc]="0"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[avscc]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[apsta]="0"' 00:09:11.856 04:59:31 nvme_scc -- 
nvme/functions.sh@23 -- # nvme2[apsta]=0 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 343 ]] 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[wctemp]="343"' 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[wctemp]=343 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.856 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 373 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[cctemp]="373"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[cctemp]=373 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[mtfa]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[mtfa]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[hmpre]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[hmpre]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[hmmin]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[hmmin]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[tnvmcap]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[tnvmcap]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[unvmcap]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[unvmcap]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[rpmbs]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[rpmbs]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[edstt]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[edstt]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 
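
The wctemp/cctemp values just recorded look odd until you recall that the NVMe spec reports these thresholds in kelvin; converting shows this emulated controller warns at 70 C and goes critical at 100 C:

    wctemp=343; cctemp=373                 # captured above for nvme2
    echo "warning threshold:  $(( wctemp - 273 )) C"    # -> 70 C
    echo "critical threshold: $(( cctemp - 273 )) C"    # -> 100 C
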
00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[dsto]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[dsto]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[fwug]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[fwug]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[kas]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[kas]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[hctma]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[hctma]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[mntmt]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[mntmt]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[mxtmt]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[mxtmt]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[sanicap]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[sanicap]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[hmminds]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[hmminds]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[hmmaxd]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[hmmaxd]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[nsetidmax]="0"' 00:09:11.857 04:59:31 nvme_scc -- 
nvme/functions.sh@23 -- # nvme2[nsetidmax]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[endgidmax]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[endgidmax]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[anatt]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[anatt]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[anacap]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[anacap]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[anagrpmax]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[anagrpmax]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[nanagrpid]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[nanagrpid]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[pels]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[pels]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[domainid]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[domainid]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[megcap]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[megcap]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x66 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[sqes]="0x66"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[sqes]=0x66 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 
-- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x44 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[cqes]="0x44"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[cqes]=0x44 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[maxcmd]="0"' 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[maxcmd]=0 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 256 ]] 00:09:11.857 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[nn]="256"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[nn]=256 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x15d ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[oncs]="0x15d"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[oncs]=0x15d 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[fuses]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[fuses]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[fna]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[fna]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[vwc]="0x7"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[vwc]=0x7 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[awun]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[awun]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[awupf]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[awupf]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[icsvscc]="0"' 00:09:11.858 
04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[icsvscc]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[nwpc]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[nwpc]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[acwu]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[acwu]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[ocfs]="0x3"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[ocfs]=0x3 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[sgls]="0x1"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[sgls]=0x1 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[mnan]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[mnan]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[maxdna]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[maxdna]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[maxcna]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[maxcna]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n nqn.2019-08.org.qemu:12342 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[subnqn]="nqn.2019-08.org.qemu:12342"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[subnqn]=nqn.2019-08.org.qemu:12342 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[ioccsz]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[ioccsz]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- 
# IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[iorcsz]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[iorcsz]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[icdoff]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[icdoff]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[fcatt]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[fcatt]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[msdbd]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[msdbd]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[ofcs]="0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[ofcs]=0 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[ps0]="mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[ps0]='mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 rwl:0 idle_power:- active_power:- ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[rwt]="0 rwl:0 idle_power:- active_power:-"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[rwt]='0 rwl:0 idle_power:- active_power:-' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n - ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2[active_power_workload]="-"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2[active_power_workload]=- 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@53 -- # local -n _ctrl_ns=nvme2_ns 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:11.858 
04:59:31 nvme_scc -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/ng2n1 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@56 -- # ns_dev=ng2n1 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@57 -- # nvme_get ng2n1 id-ns /dev/ng2n1 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=ng2n1 reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'ng2n1=()' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/ng2n1 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nsze]="0x100000"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nsze]=0x100000 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[ncap]="0x100000"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[ncap]=0x100000 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nuse]="0x100000"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nuse]=0x100000 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nsfeat]="0x14"' 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nsfeat]=0x14 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.858 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nlbaf]="7"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nlbaf]=7 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[flbas]="0x4"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[flbas]=0x4 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[mc]="0x3"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[mc]=0x3 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 
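
The ng2n1 fields being collected here are enough to compute the namespace size by hand: the low four bits of flbas select the in-use LBA format, whose lbads gives a power-of-two block size. A quick check with the values captured above (nsze=0x100000, flbas=0x4), assuming lbaf4 carries lbads:12 as it did for the first controller's namespaces — an assumption, since ng2n1's lbaf table falls outside this excerpt:

    nsze=0x100000; flbas=0x4               # captured above for ng2n1
    lbaf=$(( flbas & 0xf ))                # in-use LBA format index -> 4
    lbads=12                               # assumed from lbaf4 'lbads:12'
    echo "$(( nsze * (1 << lbads) )) bytes"   # 0x100000 * 4096 = 4294967296 (4 GiB)
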
00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[dpc]="0x1f"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[dpc]=0x1f 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[dps]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[dps]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nmic]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nmic]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[rescap]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[rescap]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[fpi]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[fpi]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[dlfeat]="1"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[dlfeat]=1 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nawun]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nawun]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nawupf]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nawupf]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nacwu]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nacwu]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- 
# eval 'ng2n1[nabsn]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nabsn]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nabo]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nabo]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nabspf]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nabspf]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[noiob]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[noiob]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nvmcap]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nvmcap]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[npwg]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[npwg]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[npwa]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[npwa]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[npdg]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[npdg]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[npda]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[npda]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nows]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nows]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[mssrl]="128"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[mssrl]=128 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[mcl]="128"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[mcl]=128 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[msrc]="127"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[msrc]=127 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nulbaf]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nulbaf]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[anagrpid]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[anagrpid]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nsattr]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nsattr]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nvmsetid]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nvmsetid]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[endgid]="0"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[endgid]=0 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[nguid]="00000000000000000000000000000000"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[nguid]=00000000000000000000000000000000 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- 
nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[eui64]="0000000000000000"' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[eui64]=0000000000000000 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:11.859 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 
'ng2n1[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n1[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=ng2n1 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/ng2n2 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@56 -- # ns_dev=ng2n2 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@57 -- # nvme_get ng2n2 id-ns /dev/ng2n2 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=ng2n2 reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'ng2n2=()' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/ng2n2 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n2[nsze]="0x100000"' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n2[nsze]=0x100000 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n2[ncap]="0x100000"' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n2[ncap]=0x100000 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n2[nuse]="0x100000"' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n2[nuse]=0x100000 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n2[nsfeat]="0x14"' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n2[nsfeat]=0x14 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n2[nlbaf]="7"' 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # ng2n2[nlbaf]=7 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:11.860 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'ng2n2[flbas]="0x4"' 00:09:11.860 04:59:31 nvme_scc -- 
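Before the ng2n2 output, a note on the loop driving this. The functions.sh@53-58 trace lines show the enumeration pattern: both the ngXnY character devices and the nvmeXnY block devices under the controller's sysfs directory are matched with an extglob pattern, each is parsed with nvme_get, and the result is recorded in a per-controller array through a nameref. A sketch of that pattern at top level (the real code runs inside a function, hence its local -n; names and the nullglob guard here are assumptions):

    shopt -s extglob nullglob
    ctrl=/sys/class/nvme/nvme2
    declare -A nvme2_ns=()
    declare -n _ctrl_ns=${ctrl##*/}_ns        # nameref: _ctrl_ns -> nvme2_ns
    for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")*; do   # matches ng2* and nvme2n*
        [[ -e $ns ]] || continue              # functions.sh@55
        ns_dev=${ns##*/}                      # ng2n1, ng2n2, ... nvme2n1
        nvme_get "$ns_dev" id-ns "/dev/$ns_dev"   # functions.sh@57
        _ctrl_ns[${ns##*n}]=$ns_dev           # functions.sh@58: index by namespace number
    done

This is why each namespace below is visited twice in the log, once as its ng2nY character device and once as its nvme2nY block device.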
00:09:11.860 04:59:31 nvme_scc -- # ng2n2 id-ns fields (condensed from the per-register trace):
00:09:11.860 04:59:31 nvme_scc -- #   nsze=0x100000 ncap=0x100000 nuse=0x100000 nsfeat=0x14 nlbaf=7 flbas=0x4 mc=0x3 dpc=0x1f dps=0
00:09:11.860 04:59:31 nvme_scc -- #   nmic=0 rescap=0 fpi=0 dlfeat=1 nawun=0 nawupf=0 nacwu=0 nabsn=0 nabo=0 nabspf=0 noiob=0
00:09:11.861 04:59:31 nvme_scc -- #   nvmcap=0 npwg=0 npwa=0 npdg=0 npda=0 nows=0 mssrl=128 mcl=128 msrc=127 nulbaf=0
00:09:11.861 04:59:31 nvme_scc -- #   anagrpid=0 nsattr=0 nvmsetid=0 endgid=0 nguid=00000000000000000000000000000000 eui64=0000000000000000
00:09:11.861 04:59:31 nvme_scc -- #   lbaf0='ms:0 lbads:9 rp:0' lbaf1='ms:8 lbads:9 rp:0' lbaf2='ms:16 lbads:9 rp:0' lbaf3='ms:64 lbads:9 rp:0'
00:09:11.861 04:59:31 nvme_scc -- #   lbaf4='ms:0 lbads:12 rp:0 (in use)' lbaf5='ms:8 lbads:12 rp:0' lbaf6='ms:16 lbads:12 rp:0' lbaf7='ms:64 lbads:12 rp:0'
00:09:11.861 04:59:31 nvme_scc -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=ng2n2
00:09:11.861 04:59:31 nvme_scc -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/ng2n3 ]]
00:09:11.861 04:59:31 nvme_scc -- nvme/functions.sh@56 -- # ns_dev=ng2n3
00:09:11.861 04:59:31 nvme_scc -- nvme/functions.sh@57 -- # nvme_get ng2n3 id-ns /dev/ng2n3
00:09:11.861 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/ng2n3
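A note on reading the lbafN entries above: ms is the per-block metadata size in bytes, lbads is the LBA data size as a power of two, and rp is the relative performance hint. flbas=0x4 selects lbaf4, which the trace marks "(in use)". A quick check of the in-use format's block size (illustrative snippet, not trace output):

    # lbaf4 "ms:0 lbads:12 rp:0 (in use)" -> 4096-byte blocks, no metadata
    lbads=12
    echo "block size: $((1 << lbads)) bytes"   # prints: block size: 4096 bytes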
00:09:11.862 04:59:31 nvme_scc -- # ng2n3 id-ns fields (condensed from the per-register trace):
00:09:11.862 04:59:31 nvme_scc -- #   nsze=0x100000 ncap=0x100000 nuse=0x100000 nsfeat=0x14 nlbaf=7 flbas=0x4 mc=0x3 dpc=0x1f dps=0
00:09:11.862 04:59:31 nvme_scc -- #   nmic=0 rescap=0 fpi=0 dlfeat=1 nawun=0 nawupf=0 nacwu=0 nabsn=0 nabo=0 nabspf=0 noiob=0
00:09:11.862 04:59:31 nvme_scc -- #   nvmcap=0 npwg=0 npwa=0 npdg=0 npda=0 nows=0 mssrl=128 mcl=128 msrc=127 nulbaf=0
00:09:11.862 04:59:31 nvme_scc -- #   anagrpid=0 nsattr=0 nvmsetid=0 endgid=0 nguid=00000000000000000000000000000000 eui64=0000000000000000
00:09:11.863 04:59:31 nvme_scc -- #   lbaf0='ms:0 lbads:9 rp:0' lbaf1='ms:8 lbads:9 rp:0' lbaf2='ms:16 lbads:9 rp:0' lbaf3='ms:64 lbads:9 rp:0'
00:09:11.863 04:59:31 nvme_scc -- #   lbaf4='ms:0 lbads:12 rp:0 (in use)' lbaf5='ms:8 lbads:12 rp:0' lbaf6='ms:16 lbads:12 rp:0' lbaf7='ms:64 lbads:12 rp:0'
00:09:11.863 04:59:31 nvme_scc -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=ng2n3
00:09:11.863 04:59:31 nvme_scc -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/nvme2n1 ]]
00:09:11.863 04:59:31 nvme_scc -- nvme/functions.sh@56 -- # ns_dev=nvme2n1
00:09:11.863 04:59:31 nvme_scc -- nvme/functions.sh@57 -- # nvme_get nvme2n1 id-ns /dev/nvme2n1
00:09:11.863 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/nvme2n1
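Every namespace enumerated above reports the same geometry, so its usable size follows directly from nsze and the in-use block size. A worked example using the values parsed above (illustrative snippet, not trace output):

    nsze=0x100000                        # 1,048,576 blocks, as parsed above
    bs=4096                              # from the in-use lbaf4 (lbads:12)
    echo "$(( nsze * bs )) bytes"        # prints: 4294967296 bytes
    echo "$(( nsze * bs >> 30 )) GiB"    # prints: 4 GiB

That is, 2^20 blocks of 2^12 bytes gives a 2^32-byte (4 GiB) namespace.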
00:09:11.863 04:59:31 nvme_scc -- # nvme2n1 id-ns fields (condensed from the per-register trace):
00:09:11.863 04:59:31 nvme_scc -- #   nsze=0x100000 ncap=0x100000 nuse=0x100000 nsfeat=0x14 nlbaf=7 flbas=0x4 mc=0x3 dpc=0x1f dps=0
00:09:11.863 04:59:31 nvme_scc -- #   nmic=0 rescap=0 fpi=0 dlfeat=1 nawun=0 nawupf=0 nacwu=0 nabsn=0 nabo=0 nabspf=0 noiob=0
00:09:11.864 04:59:31 nvme_scc -- #   nvmcap=0 npwg=0 npwa=0 npdg=0 npda=0 nows=0 mssrl=128 mcl=128 msrc=127 nulbaf=0
00:09:11.864 04:59:31 nvme_scc -- #   anagrpid=0 nsattr=0 nvmsetid=0 endgid=0 nguid=00000000000000000000000000000000 eui64=0000000000000000
00:09:12.127 04:59:31 nvme_scc -- #   lbaf0='ms:0 lbads:9 rp:0' lbaf1='ms:8 lbads:9 rp:0' lbaf2='ms:16 lbads:9 rp:0' lbaf3='ms:64 lbads:9 rp:0'
lbads:9 rp:0 ' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n1[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n1[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme2n1 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/nvme2n2 ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@56 -- # ns_dev=nvme2n2 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@57 -- # nvme_get nvme2n2 id-ns /dev/nvme2n2 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=nvme2n2 reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'nvme2n2=()' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/nvme2n2 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nsze]="0x100000"' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nsze]=0x100000 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 
]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[ncap]="0x100000"' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[ncap]=0x100000 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nuse]="0x100000"' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nuse]=0x100000 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nsfeat]="0x14"' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nsfeat]=0x14 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nlbaf]="7"' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nlbaf]=7 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[flbas]="0x4"' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[flbas]=0x4 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[mc]="0x3"' 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[mc]=0x3 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.127 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[dpc]="0x1f"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[dpc]=0x1f 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[dps]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[dps]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nmic]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nmic]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[rescap]="0"' 00:09:12.128 04:59:31 nvme_scc -- 
nvme/functions.sh@23 -- # nvme2n2[rescap]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[fpi]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[fpi]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[dlfeat]="1"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[dlfeat]=1 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nawun]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nawun]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nawupf]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nawupf]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nacwu]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nacwu]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nabsn]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nabsn]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nabo]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nabo]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nabspf]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nabspf]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[noiob]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[noiob]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # 
read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nvmcap]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nvmcap]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[npwg]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[npwg]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[npwa]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[npwa]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[npdg]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[npdg]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[npda]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[npda]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nows]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nows]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[mssrl]="128"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[mssrl]=128 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[mcl]="128"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[mcl]=128 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[msrc]="127"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[msrc]=127 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 
'nvme2n2[nulbaf]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nulbaf]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[anagrpid]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[anagrpid]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nsattr]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nsattr]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nvmsetid]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nvmsetid]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[endgid]="0"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[endgid]=0 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[nguid]="00000000000000000000000000000000"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[nguid]=00000000000000000000000000000000 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[eui64]="0000000000000000"' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[eui64]=0000000000000000 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.128 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:31 nvme_scc -- 
nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n2[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme2n2 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/nvme2n3 ]] 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@56 -- # ns_dev=nvme2n3 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@57 -- # nvme_get nvme2n3 id-ns /dev/nvme2n3 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@17 -- # local ref=nvme2n3 reg val 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'nvme2n3=()' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/nvme2n3 00:09:12.129 
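[Editor's note: the trace above repeats the same parsing pattern for each namespace. A minimal sketch of what nvme/functions.sh is doing here, simplified rather than copied from SPDK: it runs nvme-cli's id-ns against the device, splits each "field : value" line on ':' with read -r, and evals the pair into a namespace-named associative array (nvme2n1, nvme2n2, nvme2n3 above). The helper name nvme_get_sketch is hypothetical.]

  #!/usr/bin/env bash
  # Sketch, assuming nvme-cli prints "reg : val" pairs, one per line.
  nvme_get_sketch() {
    local ref=$1 dev=$2 reg val
    declare -gA "$ref=()"                  # e.g. nvme2n3=()
    while IFS=: read -r reg val; do
      reg=${reg//[[:space:]]/}             # strip padding around the key
      val=${val# }                         # drop the leading space after ':'
      [[ -n $reg && -n $val ]] || continue # skip blank lines / headers
      eval "${ref}[\$reg]=\$val"           # -> nvme2n3[nsze]=0x100000 ...
    done < <(nvme id-ns "$dev")
  }
  # usage: nvme_get_sketch nvme2n3 /dev/nvme2n3; echo "${nvme2n3[nsze]}"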
04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nsze]="0x100000"' 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nsze]=0x100000 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:31 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[ncap]="0x100000"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[ncap]=0x100000 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nuse]="0x100000"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nuse]=0x100000 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nsfeat]="0x14"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nsfeat]=0x14 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nlbaf]="7"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nlbaf]=7 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[flbas]="0x4"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[flbas]=0x4 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[mc]="0x3"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[mc]=0x3 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[dpc]="0x1f"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[dpc]=0x1f 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[dps]="0"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[dps]=0 00:09:12.129 04:59:32 nvme_scc -- 
nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nmic]="0"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nmic]=0 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[rescap]="0"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[rescap]=0 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[fpi]="0"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[fpi]=0 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[dlfeat]="1"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[dlfeat]=1 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nawun]="0"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nawun]=0 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nawupf]="0"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nawupf]=0 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nacwu]="0"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nacwu]=0 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nabsn]="0"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nabsn]=0 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nabo]="0"' 00:09:12.129 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nabo]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ 
-n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nabspf]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nabspf]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[noiob]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[noiob]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nvmcap]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nvmcap]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[npwg]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[npwg]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[npwa]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[npwa]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[npdg]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[npdg]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[npda]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[npda]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nows]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nows]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[mssrl]="128"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[mssrl]=128 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[mcl]="128"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # 
nvme2n3[mcl]=128 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[msrc]="127"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[msrc]=127 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nulbaf]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nulbaf]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[anagrpid]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[anagrpid]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nsattr]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nsattr]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nvmsetid]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nvmsetid]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[endgid]="0"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[endgid]=0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[nguid]="00000000000000000000000000000000"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[nguid]=00000000000000000000000000000000 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[eui64]="0000000000000000"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[eui64]=0000000000000000 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:12.130 04:59:32 nvme_scc -- 
nvme/functions.sh@23 -- # nvme2n3[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme2n3[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme2n3 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@60 -- # ctrls["$ctrl_dev"]=nvme2 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@61 -- # nvmes["$ctrl_dev"]=nvme2_ns 00:09:12.130 04:59:32 nvme_scc -- 
nvme/functions.sh@62 -- # bdfs["$ctrl_dev"]=0000:00:12.0 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@63 -- # ordered_ctrls[${ctrl_dev/nvme/}]=nvme2 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@47 -- # for ctrl in /sys/class/nvme/nvme* 00:09:12.130 04:59:32 nvme_scc -- nvme/functions.sh@48 -- # [[ -e /sys/class/nvme/nvme3 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@49 -- # pci=0000:00:13.0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@50 -- # pci_can_use 0000:00:13.0 00:09:12.131 04:59:32 nvme_scc -- scripts/common.sh@18 -- # local i 00:09:12.131 04:59:32 nvme_scc -- scripts/common.sh@21 -- # [[ =~ 0000:00:13.0 ]] 00:09:12.131 04:59:32 nvme_scc -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:12.131 04:59:32 nvme_scc -- scripts/common.sh@27 -- # return 0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@51 -- # ctrl_dev=nvme3 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@52 -- # nvme_get nvme3 id-ctrl /dev/nvme3 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@17 -- # local ref=nvme3 reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@18 -- # shift 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@20 -- # local -gA 'nvme3=()' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ctrl /dev/nvme3 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1b36 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[vid]="0x1b36"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[vid]=0x1b36 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1af4 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[ssvid]="0x1af4"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[ssvid]=0x1af4 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 12343 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[sn]="12343 "' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[sn]='12343 ' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n QEMU NVMe Ctrl ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[mn]="QEMU NVMe Ctrl "' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[mn]='QEMU NVMe Ctrl ' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 8.0.0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[fr]="8.0.0 "' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[fr]='8.0.0 ' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 
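[Editor's note: at this point the trace finishes nvme2 (ctrls/nvmes/bdfs/ordered_ctrls entries) and moves on to nvme3 at PCI address 0000:00:13.0, gated by pci_can_use from scripts/common.sh. A hedged sketch of that surrounding loop; the PCI_ALLOWED variable and exact matching below are illustrative, not SPDK's verbatim logic.]

  # Walk sysfs controllers, skip any whose PCI address is filtered out,
  # then collect per-controller data (as the trace does via nvme_get).
  for ctrl in /sys/class/nvme/nvme*; do
    [[ -e $ctrl ]] || continue
    pci=$(basename "$(readlink -f "$ctrl/device")")   # e.g. 0000:00:13.0
    if [[ -n ${PCI_ALLOWED:-} && " $PCI_ALLOWED " != *" $pci "* ]]; then
      continue                                        # device filtered out
    fi
    ctrl_dev=${ctrl##*/}                              # nvme3
    # nvme_get "$ctrl_dev" id-ctrl "/dev/$ctrl_dev" populates nvme3[...]
  done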
nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 6 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[rab]="6"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[rab]=6 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 525400 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[ieee]="525400"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[ieee]=525400 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x2 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[cmic]="0x2"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[cmic]=0x2 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[mdts]="7"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[mdts]=7 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[cntlid]="0"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[cntlid]=0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x10400 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[ver]="0x10400"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[ver]=0x10400 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[rtd3r]="0"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[rtd3r]=0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[rtd3e]="0"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[rtd3e]=0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x100 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[oaes]="0x100"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[oaes]=0x100 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x88010 ]] 00:09:12.131 04:59:32 nvme_scc -- 
nvme/functions.sh@23 -- # eval 'nvme3[ctratt]="0x88010"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[ctratt]=0x88010 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[rrls]="0"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[rrls]=0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[cntrltype]="1"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[cntrltype]=1 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 00000000-0000-0000-0000-000000000000 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[fguid]="00000000-0000-0000-0000-000000000000"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[fguid]=00000000-0000-0000-0000-000000000000 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[crdt1]="0"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[crdt1]=0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[crdt2]="0"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[crdt2]=0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[crdt3]="0"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[crdt3]=0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[nvmsr]="0"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[nvmsr]=0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[vwci]="0"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[vwci]=0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[mec]="0"' 00:09:12.131 04:59:32 nvme_scc -- 
nvme/functions.sh@23 -- # nvme3[mec]=0 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x12a ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[oacs]="0x12a"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[oacs]=0x12a 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[acl]="3"' 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[acl]=3 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:12.131 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[aerl]="3"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[aerl]=3 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[frmw]="0x3"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[frmw]=0x3 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[lpa]="0x7"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[lpa]=0x7 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[elpe]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[elpe]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[npss]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[npss]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[avscc]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[avscc]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[apsta]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[apsta]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 
04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 343 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[wctemp]="343"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[wctemp]=343 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 373 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[cctemp]="373"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[cctemp]=373 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[mtfa]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[mtfa]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[hmpre]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[hmpre]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[hmmin]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[hmmin]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[tnvmcap]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[tnvmcap]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[unvmcap]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[unvmcap]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[rpmbs]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[rpmbs]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[edstt]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[edstt]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[dsto]="0"' 00:09:12.132 04:59:32 nvme_scc -- 
nvme/functions.sh@23 -- # nvme3[dsto]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[fwug]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[fwug]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[kas]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[kas]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[hctma]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[hctma]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[mntmt]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[mntmt]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[mxtmt]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[mxtmt]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[sanicap]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[sanicap]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[hmminds]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[hmminds]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[hmmaxd]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[hmmaxd]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[nsetidmax]="0"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[nsetidmax]=0 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.132 
04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[endgidmax]="1"' 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[endgidmax]=1 00:09:12.132 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[anatt]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[anatt]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[anacap]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[anacap]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[anagrpmax]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[anagrpmax]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[nanagrpid]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[nanagrpid]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[pels]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[pels]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[domainid]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[domainid]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[megcap]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[megcap]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x66 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[sqes]="0x66"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[sqes]=0x66 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x44 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[cqes]="0x44"' 00:09:12.133 
04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[cqes]=0x44 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[maxcmd]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[maxcmd]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 256 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[nn]="256"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[nn]=256 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x15d ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[oncs]="0x15d"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[oncs]=0x15d 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[fuses]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[fuses]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[fna]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[fna]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[vwc]="0x7"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[vwc]=0x7 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[awun]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[awun]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[awupf]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[awupf]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[icsvscc]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[icsvscc]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read 
-r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[nwpc]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[nwpc]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[acwu]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[acwu]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[ocfs]="0x3"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[ocfs]=0x3 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0x1 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[sgls]="0x1"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[sgls]=0x1 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[mnan]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[mnan]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[maxdna]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[maxdna]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[maxcna]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[maxcna]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n nqn.2019-08.org.qemu:fdp-subsys3 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[subnqn]="nqn.2019-08.org.qemu:fdp-subsys3"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[subnqn]=nqn.2019-08.org.qemu:fdp-subsys3 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[ioccsz]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[ioccsz]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 
nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[iorcsz]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[iorcsz]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[icdoff]="0"' 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[icdoff]=0 00:09:12.133 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[fcatt]="0"' 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[fcatt]=0 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[msdbd]="0"' 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[msdbd]=0 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[ofcs]="0"' 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[ofcs]=0 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0 ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[ps0]="mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0"' 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[ps0]='mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0' 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n 0 rwl:0 idle_power:- active_power:- ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[rwt]="0 rwl:0 idle_power:- active_power:-"' 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[rwt]='0 rwl:0 idle_power:- active_power:-' 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@22 -- # [[ -n - ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # eval 'nvme3[active_power_workload]="-"' 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@23 -- # nvme3[active_power_workload]=- 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # IFS=: 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@21 -- # read -r reg val 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@53 -- # local -n _ctrl_ns=nvme3_ns 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@60 -- # ctrls["$ctrl_dev"]=nvme3 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@61 -- # nvmes["$ctrl_dev"]=nvme3_ns 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@62 -- # bdfs["$ctrl_dev"]=0000:00:13.0 00:09:12.134 04:59:32 nvme_scc -- 
nvme/functions.sh@63 -- # ordered_ctrls[${ctrl_dev/nvme/}]=nvme3 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@65 -- # (( 4 > 0 )) 00:09:12.134 04:59:32 nvme_scc -- nvme/nvme_scc.sh@17 -- # get_ctrl_with_feature scc 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@204 -- # local _ctrls feature=scc 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@206 -- # _ctrls=($(get_ctrls_with_feature "$feature")) 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@206 -- # get_ctrls_with_feature scc 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@192 -- # (( 4 == 0 )) 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@194 -- # local ctrl feature=scc 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@196 -- # type -t ctrl_has_scc 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@196 -- # [[ function == function ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@198 -- # for ctrl in "${!ctrls[@]}" 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@199 -- # ctrl_has_scc nvme1 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@184 -- # local ctrl=nvme1 oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@186 -- # get_oncs nvme1 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@171 -- # local ctrl=nvme1 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@172 -- # get_nvme_ctrl_feature nvme1 oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@69 -- # local ctrl=nvme1 reg=oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@71 -- # [[ -n nvme1 ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@73 -- # local -n _ctrl=nvme1 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@75 -- # [[ -n 0x15d ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@76 -- # echo 0x15d 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@186 -- # oncs=0x15d 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@188 -- # (( oncs & 1 << 8 )) 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@199 -- # echo nvme1 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@198 -- # for ctrl in "${!ctrls[@]}" 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@199 -- # ctrl_has_scc nvme0 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@184 -- # local ctrl=nvme0 oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@186 -- # get_oncs nvme0 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@171 -- # local ctrl=nvme0 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@172 -- # get_nvme_ctrl_feature nvme0 oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@69 -- # local ctrl=nvme0 reg=oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@71 -- # [[ -n nvme0 ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@73 -- # local -n _ctrl=nvme0 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@75 -- # [[ -n 0x15d ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@76 -- # echo 0x15d 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@186 -- # oncs=0x15d 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@188 -- # (( oncs & 1 << 8 )) 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@199 -- # echo nvme0 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@198 -- # for ctrl in "${!ctrls[@]}" 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@199 -- # ctrl_has_scc nvme3 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@184 -- # local ctrl=nvme3 oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@186 -- # get_oncs nvme3 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@171 -- # local ctrl=nvme3 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@172 -- # get_nvme_ctrl_feature nvme3 oncs 
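The loop above is get_ctrls_with_feature at work: for each discovered controller, ctrl_has_scc dereferences the per-controller associative array through a bash nameref, reads back the ONCS register captured during enumeration, and tests bit 8, which advertises the Simple Copy command. A minimal sketch of that check, simplified from the ctrl_has_scc/get_oncs calls traced here and assuming the nvme1 array is already populated:

    # Return 0 (true) if the named controller advertises Simple Copy support.
    ctrl_has_scc() {
        local ctrl=$1 oncs
        local -n _ctrl=$ctrl      # nameref: _ctrl aliases the array named "nvme1", etc.
        oncs=${_ctrl[oncs]}       # ONCS value stored by the id-ctrl parser, e.g. 0x15d
        (( oncs & 1 << 8 ))       # bit 8 of ONCS = Simple Copy Command supported
    }

With oncs=0x15d the test evaluates 0x15d & 0x100 = 0x100, which is nonzero, so as the following entries show every controller here passes and the harness settles on nvme1 at 0000:00:10.0.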
00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@69 -- # local ctrl=nvme3 reg=oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@71 -- # [[ -n nvme3 ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@73 -- # local -n _ctrl=nvme3 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@75 -- # [[ -n 0x15d ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@76 -- # echo 0x15d 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@186 -- # oncs=0x15d 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@188 -- # (( oncs & 1 << 8 )) 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@199 -- # echo nvme3 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@198 -- # for ctrl in "${!ctrls[@]}" 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@199 -- # ctrl_has_scc nvme2 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@184 -- # local ctrl=nvme2 oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@186 -- # get_oncs nvme2 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@171 -- # local ctrl=nvme2 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@172 -- # get_nvme_ctrl_feature nvme2 oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@69 -- # local ctrl=nvme2 reg=oncs 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@71 -- # [[ -n nvme2 ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@73 -- # local -n _ctrl=nvme2 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@75 -- # [[ -n 0x15d ]] 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@76 -- # echo 0x15d 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@186 -- # oncs=0x15d 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@188 -- # (( oncs & 1 << 8 )) 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@199 -- # echo nvme2 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@207 -- # (( 4 > 0 )) 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@208 -- # echo nvme1 00:09:12.134 04:59:32 nvme_scc -- nvme/functions.sh@209 -- # return 0 00:09:12.134 04:59:32 nvme_scc -- nvme/nvme_scc.sh@17 -- # ctrl=nvme1 00:09:12.134 04:59:32 nvme_scc -- nvme/nvme_scc.sh@17 -- # bdf=0000:00:10.0 00:09:12.134 04:59:32 nvme_scc -- nvme/nvme_scc.sh@19 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:09:12.392 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:09:12.961 0000:00:11.0 (1b36 0010): nvme -> uio_pci_generic 00:09:12.961 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:09:12.961 0000:00:13.0 (1b36 0010): nvme -> uio_pci_generic 00:09:12.961 0000:00:12.0 (1b36 0010): nvme -> uio_pci_generic 00:09:12.961 04:59:33 nvme_scc -- nvme/nvme_scc.sh@21 -- # run_test nvme_simple_copy /home/vagrant/spdk_repo/spdk/test/nvme/simple_copy/simple_copy -r 'trtype:pcie traddr:0000:00:10.0' 00:09:12.961 04:59:33 nvme_scc -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:09:12.961 04:59:33 nvme_scc -- common/autotest_common.sh@1111 -- # xtrace_disable 00:09:12.961 04:59:33 nvme_scc -- common/autotest_common.sh@10 -- # set +x 00:09:12.961 ************************************ 00:09:12.961 START TEST nvme_simple_copy 00:09:12.961 ************************************ 00:09:12.961 04:59:33 nvme_scc.nvme_simple_copy -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/simple_copy/simple_copy -r 'trtype:pcie traddr:0000:00:10.0' 00:09:13.220 Initializing NVMe Controllers 00:09:13.220 Attaching to 0000:00:10.0 00:09:13.220 Controller supports SCC. Attached to 0000:00:10.0 00:09:13.220 Namespace ID: 1 size: 6GB 00:09:13.220 Initialization complete. 
00:09:13.220 00:09:13.220 Controller QEMU NVMe Ctrl (12340 ) 00:09:13.220 Controller PCI vendor:6966 PCI subsystem vendor:6900 00:09:13.220 Namespace Block Size:4096 00:09:13.220 Writing LBAs 0 to 63 with Random Data 00:09:13.220 Copied LBAs from 0 - 63 to the Destination LBA 256 00:09:13.220 LBAs matching Written Data: 64 00:09:13.220 00:09:13.220 real 0m0.242s 00:09:13.220 user 0m0.085s 00:09:13.220 sys 0m0.056s 00:09:13.220 ************************************ 00:09:13.220 END TEST nvme_simple_copy 00:09:13.220 ************************************ 00:09:13.220 04:59:33 nvme_scc.nvme_simple_copy -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:13.220 04:59:33 nvme_scc.nvme_simple_copy -- common/autotest_common.sh@10 -- # set +x 00:09:13.478 ************************************ 00:09:13.479 END TEST nvme_scc 00:09:13.479 ************************************ 00:09:13.479 00:09:13.479 real 0m7.508s 00:09:13.479 user 0m1.071s 00:09:13.479 sys 0m1.331s 00:09:13.479 04:59:33 nvme_scc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:13.479 04:59:33 nvme_scc -- common/autotest_common.sh@10 -- # set +x 00:09:13.479 04:59:33 -- spdk/autotest.sh@219 -- # [[ 0 -eq 1 ]] 00:09:13.479 04:59:33 -- spdk/autotest.sh@222 -- # [[ 0 -eq 1 ]] 00:09:13.479 04:59:33 -- spdk/autotest.sh@225 -- # [[ '' -eq 1 ]] 00:09:13.479 04:59:33 -- spdk/autotest.sh@228 -- # [[ 1 -eq 1 ]] 00:09:13.479 04:59:33 -- spdk/autotest.sh@229 -- # run_test nvme_fdp test/nvme/nvme_fdp.sh 00:09:13.479 04:59:33 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:09:13.479 04:59:33 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:09:13.479 04:59:33 -- common/autotest_common.sh@10 -- # set +x 00:09:13.479 ************************************ 00:09:13.479 START TEST nvme_fdp 00:09:13.479 ************************************ 00:09:13.479 04:59:33 nvme_fdp -- common/autotest_common.sh@1129 -- # test/nvme/nvme_fdp.sh 00:09:13.479 * Looking for test storage... 00:09:13.479 * Found test storage at /home/vagrant/spdk_repo/spdk/test/nvme 00:09:13.479 04:59:33 nvme_fdp -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:09:13.479 04:59:33 nvme_fdp -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:09:13.479 04:59:33 nvme_fdp -- common/autotest_common.sh@1711 -- # lcov --version 00:09:13.479 04:59:33 nvme_fdp -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@336 -- # IFS=.-: 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@336 -- # read -ra ver1 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@337 -- # IFS=.-: 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@337 -- # read -ra ver2 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@338 -- # local 'op=<' 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@340 -- # ver1_l=2 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@341 -- # ver2_l=1 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@344 -- # case "$op" in 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@345 -- # : 1 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@365 -- # decimal 1 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@353 -- # local d=1 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@355 -- # echo 1 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@365 -- # ver1[v]=1 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@366 -- # decimal 2 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@353 -- # local d=2 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@355 -- # echo 2 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@366 -- # ver2[v]=2 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@368 -- # return 0 00:09:13.479 04:59:33 nvme_fdp -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:13.479 04:59:33 nvme_fdp -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:09:13.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:13.479 --rc genhtml_branch_coverage=1 00:09:13.479 --rc genhtml_function_coverage=1 00:09:13.479 --rc genhtml_legend=1 00:09:13.479 --rc geninfo_all_blocks=1 00:09:13.479 --rc geninfo_unexecuted_blocks=1 00:09:13.479 00:09:13.479 ' 00:09:13.479 04:59:33 nvme_fdp -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:09:13.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:13.479 --rc genhtml_branch_coverage=1 00:09:13.479 --rc genhtml_function_coverage=1 00:09:13.479 --rc genhtml_legend=1 00:09:13.479 --rc geninfo_all_blocks=1 00:09:13.479 --rc geninfo_unexecuted_blocks=1 00:09:13.479 00:09:13.479 ' 00:09:13.479 04:59:33 nvme_fdp -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:09:13.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:13.479 --rc genhtml_branch_coverage=1 00:09:13.479 --rc genhtml_function_coverage=1 00:09:13.479 --rc genhtml_legend=1 00:09:13.479 --rc geninfo_all_blocks=1 00:09:13.479 --rc geninfo_unexecuted_blocks=1 00:09:13.479 00:09:13.479 ' 00:09:13.479 04:59:33 nvme_fdp -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:09:13.479 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:13.479 --rc genhtml_branch_coverage=1 00:09:13.479 --rc genhtml_function_coverage=1 00:09:13.479 --rc genhtml_legend=1 00:09:13.479 --rc geninfo_all_blocks=1 00:09:13.479 --rc geninfo_unexecuted_blocks=1 00:09:13.479 00:09:13.479 ' 00:09:13.479 04:59:33 nvme_fdp -- cuse/common.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/common/nvme/functions.sh 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@7 -- # dirname /home/vagrant/spdk_repo/spdk/test/common/nvme/functions.sh 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@7 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/common/nvme/../../../ 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@7 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@8 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@15 -- # shopt -s extglob 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@552 -- # [[ -e 
/etc/opt/spdk-pkgdep/paths/export.sh ]] 00:09:13.479 04:59:33 nvme_fdp -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:09:13.479 04:59:33 nvme_fdp -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:13.479 04:59:33 nvme_fdp -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:13.479 04:59:33 nvme_fdp -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:13.479 04:59:33 nvme_fdp -- paths/export.sh@5 -- # export PATH 00:09:13.479 04:59:33 nvme_fdp -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@10 -- # ctrls=() 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@10 -- # declare -A ctrls 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@11 -- # nvmes=() 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@11 -- # declare -A nvmes 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@12 -- # bdfs=() 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@12 -- # declare -A bdfs 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@13 -- # ordered_ctrls=() 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@13 -- # declare -a ordered_ctrls 00:09:13.479 04:59:33 nvme_fdp -- nvme/functions.sh@14 -- # nvme_name= 00:09:13.479 04:59:33 nvme_fdp -- cuse/common.sh@11 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:09:13.479 04:59:33 nvme_fdp -- nvme/nvme_fdp.sh@10 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:09:13.737 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:09:13.995 Waiting for block devices as requested 00:09:13.995 0000:00:11.0 (1b36 0010): uio_pci_generic -> nvme 00:09:13.996 0000:00:10.0 (1b36 0010): uio_pci_generic -> nvme 00:09:14.254 0000:00:12.0 (1b36 0010): uio_pci_generic -> nvme 00:09:14.254 0000:00:13.0 (1b36 0010): uio_pci_generic -> nvme 00:09:19.538 * Events for some block/disk devices (0000:00:13.0) were not caught, they may be missing 00:09:19.538 04:59:39 nvme_fdp -- nvme/nvme_fdp.sh@12 -- # scan_nvme_ctrls 00:09:19.538 04:59:39 nvme_fdp 
-- nvme/functions.sh@45 -- # local ctrl ctrl_dev reg val ns pci 00:09:19.538 04:59:39 nvme_fdp -- nvme/functions.sh@47 -- # for ctrl in /sys/class/nvme/nvme* 00:09:19.538 04:59:39 nvme_fdp -- nvme/functions.sh@48 -- # [[ -e /sys/class/nvme/nvme0 ]] 00:09:19.538 04:59:39 nvme_fdp -- nvme/functions.sh@49 -- # pci=0000:00:11.0 00:09:19.538 04:59:39 nvme_fdp -- nvme/functions.sh@50 -- # pci_can_use 0000:00:11.0 00:09:19.538 04:59:39 nvme_fdp -- scripts/common.sh@18 -- # local i 00:09:19.538 04:59:39 nvme_fdp -- scripts/common.sh@21 -- # [[ =~ 0000:00:11.0 ]] 00:09:19.539 04:59:39 nvme_fdp -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:19.539 04:59:39 nvme_fdp -- scripts/common.sh@27 -- # return 0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@51 -- # ctrl_dev=nvme0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@52 -- # nvme_get nvme0 id-ctrl /dev/nvme0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=nvme0 reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'nvme0=()' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ctrl /dev/nvme0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1b36 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[vid]="0x1b36"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[vid]=0x1b36 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1af4 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[ssvid]="0x1af4"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[ssvid]=0x1af4 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 12341 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[sn]="12341 "' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[sn]='12341 ' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n QEMU NVMe Ctrl ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[mn]="QEMU NVMe Ctrl "' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[mn]='QEMU NVMe Ctrl ' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 8.0.0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[fr]="8.0.0 "' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[fr]='8.0.0 ' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- 
nvme/functions.sh@22 -- # [[ -n 6 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[rab]="6"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[rab]=6 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 525400 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[ieee]="525400"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[ieee]=525400 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[cmic]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[cmic]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[mdts]="7"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[mdts]=7 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[cntlid]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[cntlid]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x10400 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[ver]="0x10400"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[ver]=0x10400 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[rtd3r]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[rtd3r]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[rtd3e]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[rtd3e]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[oaes]="0x100"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[oaes]=0x100 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x8000 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[ctratt]="0x8000"' 00:09:19.539 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # nvme0[ctratt]=0x8000 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[rrls]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[rrls]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[cntrltype]="1"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[cntrltype]=1 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000-0000-0000-0000-000000000000 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[fguid]="00000000-0000-0000-0000-000000000000"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[fguid]=00000000-0000-0000-0000-000000000000 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[crdt1]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[crdt1]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[crdt2]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[crdt2]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[crdt3]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[crdt3]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[nvmsr]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[nvmsr]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[vwci]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[vwci]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[mec]="0"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[mec]=0 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 
-- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x12a ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[oacs]="0x12a"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[oacs]=0x12a 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[acl]="3"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[acl]=3 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[aerl]="3"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[aerl]=3 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[frmw]="0x3"' 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[frmw]=0x3 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.539 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[lpa]="0x7"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[lpa]=0x7 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[elpe]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[elpe]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[npss]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[npss]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[avscc]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[avscc]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[apsta]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[apsta]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 343 ]] 00:09:19.540 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # eval 'nvme0[wctemp]="343"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[wctemp]=343 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 373 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[cctemp]="373"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[cctemp]=373 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[mtfa]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[mtfa]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[hmpre]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[hmpre]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[hmmin]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[hmmin]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[tnvmcap]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[tnvmcap]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[unvmcap]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[unvmcap]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[rpmbs]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[rpmbs]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[edstt]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[edstt]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[dsto]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[dsto]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # 
IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[fwug]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[fwug]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[kas]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[kas]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[hctma]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[hctma]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[mntmt]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[mntmt]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[mxtmt]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[mxtmt]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[sanicap]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[sanicap]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[hmminds]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[hmminds]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[hmmaxd]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[hmmaxd]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[nsetidmax]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[nsetidmax]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # eval 'nvme0[endgidmax]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[endgidmax]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[anatt]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[anatt]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[anacap]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[anacap]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[anagrpmax]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[anagrpmax]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[nanagrpid]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[nanagrpid]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[pels]="0"' 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[pels]=0 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.540 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[domainid]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[domainid]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[megcap]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[megcap]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x66 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[sqes]="0x66"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[sqes]=0x66 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x44 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[cqes]="0x44"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[cqes]=0x44 00:09:19.541 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[maxcmd]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[maxcmd]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 256 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[nn]="256"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[nn]=256 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x15d ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[oncs]="0x15d"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[oncs]=0x15d 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[fuses]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[fuses]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[fna]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[fna]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[vwc]="0x7"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[vwc]=0x7 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[awun]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[awun]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[awupf]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[awupf]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[icsvscc]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[icsvscc]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 
04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[nwpc]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[nwpc]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[acwu]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[acwu]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[ocfs]="0x3"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[ocfs]=0x3 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[sgls]="0x1"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[sgls]=0x1 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[mnan]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[mnan]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[maxdna]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[maxdna]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[maxcna]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[maxcna]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n nqn.2019-08.org.qemu:12341 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[subnqn]="nqn.2019-08.org.qemu:12341"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[subnqn]=nqn.2019-08.org.qemu:12341 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[ioccsz]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[ioccsz]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[iorcsz]="0"' 00:09:19.541 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # nvme0[iorcsz]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[icdoff]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[icdoff]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[fcatt]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[fcatt]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[msdbd]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[msdbd]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[ofcs]="0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[ofcs]=0 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[ps0]="mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[ps0]='mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 rwl:0 idle_power:- active_power:- ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[rwt]="0 rwl:0 idle_power:- active_power:-"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[rwt]='0 rwl:0 idle_power:- active_power:-' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n - ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0[active_power_workload]="-"' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0[active_power_workload]=- 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@53 -- # local -n _ctrl_ns=nvme0_ns 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme0/ng0n1 ]] 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@56 -- # ns_dev=ng0n1 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@57 -- # nvme_get ng0n1 id-ns /dev/ng0n1 00:09:19.541 04:59:39 
nvme_fdp -- nvme/functions.sh@17 -- # local ref=ng0n1 reg val 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'ng0n1=()' 00:09:19.541 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/ng0n1 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nsze]="0x140000"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nsze]=0x140000 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[ncap]="0x140000"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[ncap]=0x140000 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nuse]="0x140000"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nuse]=0x140000 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nsfeat]="0x14"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nsfeat]=0x14 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nlbaf]="7"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nlbaf]=7 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[flbas]="0x4"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[flbas]=0x4 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[mc]="0x3"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[mc]=0x3 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[dpc]="0x1f"' 00:09:19.542 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # ng0n1[dpc]=0x1f 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[dps]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[dps]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nmic]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nmic]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[rescap]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[rescap]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[fpi]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[fpi]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[dlfeat]="1"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[dlfeat]=1 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nawun]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nawun]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nawupf]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nawupf]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nacwu]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nacwu]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nabsn]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nabsn]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 
nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nabo]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nabo]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nabspf]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nabspf]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[noiob]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[noiob]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nvmcap]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nvmcap]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[npwg]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[npwg]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[npwa]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[npwa]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[npdg]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[npdg]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[npda]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[npda]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nows]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nows]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[mssrl]="128"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[mssrl]=128 
00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[mcl]="128"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[mcl]=128 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[msrc]="127"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[msrc]=127 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nulbaf]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nulbaf]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[anagrpid]="0"' 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[anagrpid]=0 00:09:19.542 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nsattr]="0"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nsattr]=0 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nvmsetid]="0"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nvmsetid]=0 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[endgid]="0"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[endgid]=0 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[nguid]="00000000000000000000000000000000"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[nguid]=00000000000000000000000000000000 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[eui64]="0000000000000000"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[eui64]=0000000000000000 00:09:19.543 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng0n1[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng0n1[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 
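The @16-@23 records above show how nvme/functions.sh's nvme_get helper turns the output of /usr/local/src/nvme-cli/nvme id-ns /dev/ng0n1 into the ng0n1 associative array: @21 splits each "key : value" line on ':', @22 skips keys with an empty value, and @23 evals the pair into the array. A minimal reconstruction of that loop, inferred from the trace rather than copied from the script (needs bash >= 4.2 for local -gA; details such as the exact whitespace trimming are assumptions):

    nvme_get() {                          # sketch; argument shape per the trace:
        local ref=$1 reg val              #   nvme_get ng0n1 id-ns /dev/ng0n1
        shift
        local -gA "$ref=()"               # @20: declare the target array globally
        while IFS=: read -r reg val; do   # @21: split each output line on ':'
            reg=${reg//[[:space:]]/}      # keys arrive padded, e.g. 'nsze    '
            val=${val# }                  # drop the single space after the colon
            [[ -n $val ]] || continue     # @22: header/empty lines set nothing
            eval "${ref}[${reg}]=\"${val}\""   # @23: e.g. ng0n1[nsze]="0x140000"
        done < <(/usr/local/src/nvme-cli/nvme "$@")   # @16: id-ns /dev/ng0n1
    }

Reading the values it just stored: flbas=0x4 selects lbaf4, whose lbads:12 means 2^12-byte blocks, so nsze=0x140000 (1,310,720) blocks puts the namespace at 1,310,720 * 4096 bytes = 5 GiB.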
00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=ng0n1 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme0/nvme0n1 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@56 -- # ns_dev=nvme0n1 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@57 -- # nvme_get nvme0n1 id-ns /dev/nvme0n1 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=nvme0n1 reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'nvme0n1=()' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/nvme0n1 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nsze]="0x140000"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nsze]=0x140000 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[ncap]="0x140000"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[ncap]=0x140000 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x140000 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nuse]="0x140000"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nuse]=0x140000 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nsfeat]="0x14"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nsfeat]=0x14 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nlbaf]="7"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nlbaf]=7 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[flbas]="0x4"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[flbas]=0x4 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- 
nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[mc]="0x3"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[mc]=0x3 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[dpc]="0x1f"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[dpc]=0x1f 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[dps]="0"' 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[dps]=0 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.543 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nmic]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nmic]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[rescap]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[rescap]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[fpi]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[fpi]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[dlfeat]="1"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[dlfeat]=1 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nawun]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nawun]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nawupf]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nawupf]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nacwu]="0"' 00:09:19.544 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # nvme0n1[nacwu]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nabsn]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nabsn]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nabo]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nabo]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nabspf]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nabspf]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[noiob]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[noiob]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nvmcap]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nvmcap]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[npwg]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[npwg]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[npwa]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[npwa]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[npdg]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[npdg]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[npda]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[npda]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r 
reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nows]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nows]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[mssrl]="128"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[mssrl]=128 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[mcl]="128"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[mcl]=128 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[msrc]="127"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[msrc]=127 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nulbaf]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nulbaf]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[anagrpid]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[anagrpid]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nsattr]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nsattr]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[nvmsetid]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nvmsetid]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[endgid]="0"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[endgid]=0 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:19.544 04:59:39 nvme_fdp 
-- nvme/functions.sh@23 -- # eval 'nvme0n1[nguid]="00000000000000000000000000000000"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[nguid]=00000000000000000000000000000000 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[eui64]="0000000000000000"' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[eui64]=0000000000000000 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:19.544 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf6]="ms:16 lbads:12 rp:0 
"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme0n1[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme0n1[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme0n1 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@60 -- # ctrls["$ctrl_dev"]=nvme0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@61 -- # nvmes["$ctrl_dev"]=nvme0_ns 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@62 -- # bdfs["$ctrl_dev"]=0000:00:11.0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@63 -- # ordered_ctrls[${ctrl_dev/nvme/}]=nvme0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@47 -- # for ctrl in /sys/class/nvme/nvme* 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@48 -- # [[ -e /sys/class/nvme/nvme1 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@49 -- # pci=0000:00:10.0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@50 -- # pci_can_use 0000:00:10.0 00:09:19.545 04:59:39 nvme_fdp -- scripts/common.sh@18 -- # local i 00:09:19.545 04:59:39 nvme_fdp -- scripts/common.sh@21 -- # [[ =~ 0000:00:10.0 ]] 00:09:19.545 04:59:39 nvme_fdp -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:19.545 04:59:39 nvme_fdp -- scripts/common.sh@27 -- # return 0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@51 -- # ctrl_dev=nvme1 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@52 -- # nvme_get nvme1 id-ctrl /dev/nvme1 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=nvme1 reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'nvme1=()' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ctrl /dev/nvme1 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1b36 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[vid]="0x1b36"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[vid]=0x1b36 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1af4 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[ssvid]="0x1af4"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[ssvid]=0x1af4 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 12340 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 
'nvme1[sn]="12340 "' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[sn]='12340 ' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n QEMU NVMe Ctrl ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[mn]="QEMU NVMe Ctrl "' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[mn]='QEMU NVMe Ctrl ' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 8.0.0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[fr]="8.0.0 "' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[fr]='8.0.0 ' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 6 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[rab]="6"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[rab]=6 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 525400 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[ieee]="525400"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[ieee]=525400 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[cmic]="0"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[cmic]=0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[mdts]="7"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[mdts]=7 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[cntlid]="0"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[cntlid]=0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x10400 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[ver]="0x10400"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[ver]=0x10400 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[rtd3r]="0"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[rtd3r]=0 00:09:19.545 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[rtd3e]="0"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[rtd3e]=0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[oaes]="0x100"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[oaes]=0x100 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x8000 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[ctratt]="0x8000"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[ctratt]=0x8000 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[rrls]="0"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[rrls]=0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[cntrltype]="1"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[cntrltype]=1 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000-0000-0000-0000-000000000000 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[fguid]="00000000-0000-0000-0000-000000000000"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[fguid]=00000000-0000-0000-0000-000000000000 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[crdt1]="0"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[crdt1]=0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[crdt2]="0"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[crdt2]=0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[crdt3]="0"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[crdt3]=0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[nvmsr]="0"' 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[nvmsr]=0 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.545 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[vwci]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[vwci]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[mec]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[mec]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x12a ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[oacs]="0x12a"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[oacs]=0x12a 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[acl]="3"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[acl]=3 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[aerl]="3"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[aerl]=3 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[frmw]="0x3"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[frmw]=0x3 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[lpa]="0x7"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[lpa]=0x7 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[elpe]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[elpe]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[npss]="0"' 
00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[npss]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[avscc]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[avscc]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[apsta]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[apsta]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 343 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[wctemp]="343"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[wctemp]=343 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 373 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[cctemp]="373"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[cctemp]=373 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[mtfa]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[mtfa]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[hmpre]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[hmpre]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[hmmin]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[hmmin]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[tnvmcap]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[tnvmcap]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[unvmcap]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[unvmcap]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[rpmbs]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[rpmbs]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[edstt]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[edstt]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[dsto]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[dsto]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[fwug]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[fwug]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[kas]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[kas]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[hctma]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[hctma]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[mntmt]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[mntmt]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[mxtmt]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[mxtmt]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[sanicap]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[sanicap]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[hmminds]="0"' 
00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[hmminds]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[hmmaxd]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[hmmaxd]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[nsetidmax]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[nsetidmax]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[endgidmax]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[endgidmax]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[anatt]="0"' 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[anatt]=0 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.546 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[anacap]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[anacap]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[anagrpmax]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[anagrpmax]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[nanagrpid]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[nanagrpid]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[pels]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[pels]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[domainid]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[domainid]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 
nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[megcap]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[megcap]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x66 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[sqes]="0x66"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[sqes]=0x66 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x44 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[cqes]="0x44"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[cqes]=0x44 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[maxcmd]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[maxcmd]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 256 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[nn]="256"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[nn]=256 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x15d ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[oncs]="0x15d"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[oncs]=0x15d 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[fuses]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[fuses]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[fna]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[fna]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[vwc]="0x7"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[vwc]=0x7 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 
-- # eval 'nvme1[awun]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[awun]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[awupf]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[awupf]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[icsvscc]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[icsvscc]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[nwpc]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[nwpc]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[acwu]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[acwu]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[ocfs]="0x3"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[ocfs]=0x3 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[sgls]="0x1"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[sgls]=0x1 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[mnan]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[mnan]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[maxdna]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[maxdna]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[maxcna]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[maxcna]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 
nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n nqn.2019-08.org.qemu:12340 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[subnqn]="nqn.2019-08.org.qemu:12340"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[subnqn]=nqn.2019-08.org.qemu:12340 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[ioccsz]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[ioccsz]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[iorcsz]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[iorcsz]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[icdoff]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[icdoff]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[fcatt]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[fcatt]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[msdbd]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[msdbd]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[ofcs]="0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[ofcs]=0 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0 ]] 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[ps0]="mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0"' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[ps0]='mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0' 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.547 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 rwl:0 idle_power:- active_power:- ]] 00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1[rwt]="0 rwl:0 idle_power:- active_power:-"' 00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1[rwt]='0 
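[Note] The stream above is nvme/functions.sh's nvme_get helper flattening `nvme id-ctrl` output for nvme1 into a bash associative array: each `register : value` line is split on the first colon via IFS, the padded register name becomes the key, and the pair is eval'd into a globally declared array named after the device (the per-register IFS=:/read/eval cycles have been collapsed to their resulting assignments here). A minimal sketch of that pattern, not the verbatim SPDK helper; the whitespace handling below is an assumption, the nvme-cli path is the one the trace shows at @16:

    nvme_get() {
        local ref=$1 reg val
        shift
        # declare the target array (e.g. nvme1) in the global scope, as at @20
        local -gA "$ref=()"
        while IFS=: read -r reg val; do
            [[ -n $val ]] || continue      # skip banner lines with no "reg : val" pair
            reg=${reg//[[:space:]]/}       # "sn        " -> "sn"
            val=${val# }                   # drop the space after the colon, keep the rest verbatim
            eval "${ref}[$reg]=\"\$val\""  # e.g. nvme1[sn]='12340 ' as in the trace
        done < <(/usr/local/src/nvme-cli/nvme "$@")
    }

    nvme_get nvme1 id-ctrl /dev/nvme1
    echo "${nvme1[mn]} fw ${nvme1[fr]} mdts=${nvme1[mdts]}"

Because read assigns everything after the first colon to val, values that themselves contain colons survive intact, which is why nvme1[subnqn]=nqn.2019-08.org.qemu:12340 and the ps0 power-state string parse correctly above.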
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@53 -- # local -n _ctrl_ns=nvme1_ns
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")*
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme1/ng1n1 ]]
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@56 -- # ns_dev=ng1n1
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@57 -- # nvme_get ng1n1 id-ns /dev/ng1n1
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=ng1n1 reg val
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'ng1n1=()'
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/ng1n1
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nsze]=0x17a17a
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[ncap]=0x17a17a
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nuse]=0x17a17a
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nsfeat]=0x14
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nlbaf]=7
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[flbas]=0x7
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[mc]=0x3
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[dpc]=0x1f
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[dps]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nmic]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[rescap]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[fpi]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[dlfeat]=1
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nawun]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nawupf]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nacwu]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nabsn]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nabo]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nabspf]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[noiob]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nvmcap]=0
00:09:19.548 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[npwg]=0
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[npwa]=0
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[npdg]=0
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[npda]=0
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nows]=0
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[mssrl]=128
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[mcl]=128
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[msrc]=127
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nulbaf]=0
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[anagrpid]=0
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nsattr]=0
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nvmsetid]=0
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[endgid]=0
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[nguid]=00000000000000000000000000000000
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[eui64]=0000000000000000
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[lbaf0]='ms:0 lbads:9 rp:0 '
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[lbaf1]='ms:8 lbads:9 rp:0 '
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[lbaf2]='ms:16 lbads:9 rp:0 '
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[lbaf3]='ms:64 lbads:9 rp:0 '
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[lbaf4]='ms:0 lbads:12 rp:0 '
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[lbaf5]='ms:8 lbads:12 rp:0 '
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[lbaf6]='ms:16 lbads:12 rp:0 '
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng1n1[lbaf7]='ms:64 lbads:12 rp:0 (in use)'
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=ng1n1
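[Note] The @54 loop above enumerates the controller's namespaces with a bash extglob that matches both the generic character node (ng1n1) and the block node (nvme1n1) under the controller's sysfs directory; both nodes name the same namespace, which is why the id-ns pass that follows this note repeats the values just captured for ng1n1. A short illustration of the expansions involved, assuming the sysfs layout seen in this trace:

    shopt -s extglob                  # @(...|...) patterns require extglob
    ctrl=/sys/class/nvme/nvme1
    # ${ctrl##*nvme} -> "1"     (controller index, so "ng1..." matches ng1n1)
    # ${ctrl##*/}    -> "nvme1" (device name, so "nvme1n..." matches nvme1n1)
    printf '%s\n' "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")*
    # on this VM: /sys/class/nvme/nvme1/ng1n1 and /sys/class/nvme/nvme1/nvme1n1
    ns=$ctrl/ng1n1
    echo "${ns##*n}"                  # -> "1", the namespace id keyed into _ctrl_ns at @58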
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")*
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme1/nvme1n1 ]]
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@56 -- # ns_dev=nvme1n1
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@57 -- # nvme_get nvme1n1 id-ns /dev/nvme1n1
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=nvme1n1 reg val
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'nvme1n1=()'
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/nvme1n1
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nsze]=0x17a17a
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[ncap]=0x17a17a
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nuse]=0x17a17a
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nsfeat]=0x14
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nlbaf]=7
00:09:19.549 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[flbas]=0x7
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[mc]=0x3
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[dpc]=0x1f
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[dps]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nmic]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[rescap]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[fpi]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[dlfeat]=1
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nawun]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nawupf]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nacwu]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nabsn]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nabo]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nabspf]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[noiob]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nvmcap]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[npwg]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[npwa]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[npdg]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[npda]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nows]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[mssrl]=128
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[mcl]=128
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[msrc]=127
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nulbaf]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[anagrpid]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nsattr]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nvmsetid]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[endgid]=0
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[nguid]=00000000000000000000000000000000
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[eui64]=0000000000000000
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf0]='ms:0 lbads:9 rp:0 '
00:09:19.550 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf1]='ms:8 lbads:9 rp:0 '
00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf2]='ms:16 lbads:9 rp:0 '
00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf3]='ms:64 lbads:9 rp:0 '
00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf4]='ms:0 lbads:12 rp:0 '
00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf5]='ms:8 lbads:12 rp:0 '
00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf6]='ms:16 lbads:12 rp:0 '
00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf7]='ms:64 lbads:12 rp:0 (in use)'
00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme1n1
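[Note] Decoding the namespace geometry both passes just reported: flbas=0x7 selects LBA format 7, and lbaf7 reads `ms:64 lbads:12 (in use)`, i.e. 2^12 = 4096-byte data blocks with 64 bytes of metadata each, while nsze/ncap/nuse=0x17a17a is the size in logical blocks. A quick back-of-the-envelope in bash arithmetic (the GiB rounding is mine, not from the log):

    nsze=$((0x17a17a))             # 1548666 logical blocks
    lbads=12                       # log2(block size), from the in-use lbaf7
    echo $((nsze * (1 << lbads)))  # 6343335936 bytes, roughly 5.9 GiB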
00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf4]="ms:0 lbads:12 rp:0 "' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf4]='ms:0 lbads:12 rp:0 ' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 (in use) ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme1n1[lbaf7]="ms:64 lbads:12 rp:0 (in use)"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme1n1[lbaf7]='ms:64 lbads:12 rp:0 (in use)' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme1n1 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@60 -- # ctrls["$ctrl_dev"]=nvme1 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@61 -- # nvmes["$ctrl_dev"]=nvme1_ns 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@62 -- # bdfs["$ctrl_dev"]=0000:00:10.0 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@63 -- # ordered_ctrls[${ctrl_dev/nvme/}]=nvme1 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@47 -- # for ctrl in /sys/class/nvme/nvme* 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@48 -- # [[ -e /sys/class/nvme/nvme2 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@49 -- # pci=0000:00:12.0 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@50 -- # pci_can_use 0000:00:12.0 00:09:19.551 04:59:39 nvme_fdp -- scripts/common.sh@18 -- # local i 00:09:19.551 04:59:39 nvme_fdp -- scripts/common.sh@21 -- # [[ =~ 0000:00:12.0 ]] 00:09:19.551 04:59:39 nvme_fdp -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:19.551 04:59:39 nvme_fdp -- scripts/common.sh@27 -- # return 0 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@51 -- # ctrl_dev=nvme2 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@52 -- # nvme_get nvme2 id-ctrl /dev/nvme2 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=nvme2 reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'nvme2=()' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ctrl /dev/nvme2 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # 
[[ -n '' ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1b36 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[vid]="0x1b36"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[vid]=0x1b36 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1af4 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[ssvid]="0x1af4"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[ssvid]=0x1af4 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 12342 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[sn]="12342 "' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[sn]='12342 ' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n QEMU NVMe Ctrl ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[mn]="QEMU NVMe Ctrl "' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[mn]='QEMU NVMe Ctrl ' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 8.0.0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[fr]="8.0.0 "' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[fr]='8.0.0 ' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 6 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[rab]="6"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[rab]=6 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 525400 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[ieee]="525400"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[ieee]=525400 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[cmic]="0"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[cmic]=0 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[mdts]="7"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[mdts]=7 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[cntlid]="0"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[cntlid]=0 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x10400 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[ver]="0x10400"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[ver]=0x10400 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[rtd3r]="0"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[rtd3r]=0 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[rtd3e]="0"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[rtd3e]=0 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[oaes]="0x100"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[oaes]=0x100 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x8000 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[ctratt]="0x8000"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[ctratt]=0x8000 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[rrls]="0"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[rrls]=0 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[cntrltype]="1"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[cntrltype]=1 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000-0000-0000-0000-000000000000 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[fguid]="00000000-0000-0000-0000-000000000000"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[fguid]=00000000-0000-0000-0000-000000000000 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 
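
The ver value captured just above (0x10400) packs the controller's NVMe spec version into one word: bits 31:16 hold the major version, 15:8 the minor, and 7:0 the tertiary. A quick check with plain bash arithmetic against the value from this trace:

ver=0x10400
printf 'NVMe %d.%d.%d\n' $((ver >> 16)) $(((ver >> 8) & 0xff)) $((ver & 0xff))
# -> NVMe 1.4.0 for this QEMU controller
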
00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[crdt1]="0"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[crdt1]=0 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[crdt2]="0"' 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[crdt2]=0 00:09:19.551 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[crdt3]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[crdt3]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[nvmsr]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[nvmsr]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[vwci]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[vwci]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[mec]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[mec]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x12a ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[oacs]="0x12a"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[oacs]=0x12a 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[acl]="3"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[acl]=3 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[aerl]="3"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[aerl]=3 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[frmw]="0x3"' 00:09:19.552 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # nvme2[frmw]=0x3 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[lpa]="0x7"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[lpa]=0x7 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[elpe]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[elpe]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[npss]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[npss]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[avscc]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[avscc]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[apsta]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[apsta]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 343 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[wctemp]="343"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[wctemp]=343 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 373 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[cctemp]="373"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[cctemp]=373 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[mtfa]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[mtfa]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[hmpre]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[hmpre]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 
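
The wctemp and cctemp fields a few steps up are reported in kelvin, as the spec defines the thermal thresholds; converting the two values recorded here shows the emulated controller warns at 70 C and flags critical at 100 C:

for t in 343 373; do
    printf '%d K = %d C\n' "$t" $((t - 273))
done
# -> 343 K = 70 C, 373 K = 100 C
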
00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[hmmin]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[hmmin]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[tnvmcap]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[tnvmcap]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[unvmcap]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[unvmcap]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[rpmbs]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[rpmbs]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[edstt]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[edstt]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[dsto]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[dsto]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[fwug]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[fwug]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[kas]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[kas]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[hctma]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[hctma]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[mntmt]="0"' 00:09:19.552 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # nvme2[mntmt]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[mxtmt]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[mxtmt]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[sanicap]="0"' 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[sanicap]=0 00:09:19.552 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[hmminds]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[hmminds]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[hmmaxd]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[hmmaxd]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[nsetidmax]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[nsetidmax]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[endgidmax]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[endgidmax]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[anatt]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[anatt]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[anacap]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[anacap]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[anagrpmax]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[anagrpmax]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # 
read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[nanagrpid]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[nanagrpid]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[pels]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[pels]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[domainid]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[domainid]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[megcap]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[megcap]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x66 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[sqes]="0x66"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[sqes]=0x66 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x44 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[cqes]="0x44"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[cqes]=0x44 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[maxcmd]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[maxcmd]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 256 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[nn]="256"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[nn]=256 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x15d ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[oncs]="0x15d"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[oncs]=0x15d 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 
'nvme2[fuses]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[fuses]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[fna]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[fna]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[vwc]="0x7"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[vwc]=0x7 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[awun]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[awun]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[awupf]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[awupf]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[icsvscc]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[icsvscc]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[nwpc]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[nwpc]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[acwu]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[acwu]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[ocfs]="0x3"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[ocfs]=0x3 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[sgls]="0x1"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[sgls]=0x1 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[mnan]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[mnan]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[maxdna]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[maxdna]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[maxcna]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[maxcna]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n nqn.2019-08.org.qemu:12342 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[subnqn]="nqn.2019-08.org.qemu:12342"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[subnqn]=nqn.2019-08.org.qemu:12342 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[ioccsz]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[ioccsz]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[iorcsz]="0"' 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[iorcsz]=0 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.553 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[icdoff]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[icdoff]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[fcatt]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[fcatt]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[msdbd]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[msdbd]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 
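
Once each nvme_get call returns, the rest of the suite consults these populated globals directly instead of re-running nvme-cli. A sketch of typical lookups against values recorded in this trace (the echo lines are illustrative, not from functions.sh):

echo "nvme2 subnqn: ${nvme2[subnqn]}"    # nqn.2019-08.org.qemu:12342
echo "nvme2 oncs:   ${nvme2[oncs]}"      # 0x15d, the optional-command bitmask
for ctrl in "${!ctrls[@]}"; do           # ctrls/bdfs were filled at functions.sh@60-62
    echo "$ctrl -> ${bdfs[$ctrl]}"       # e.g. nvme1 -> 0000:00:10.0
done
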
00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[ofcs]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[ofcs]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[ps0]="mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[ps0]='mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 rwl:0 idle_power:- active_power:- ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[rwt]="0 rwl:0 idle_power:- active_power:-"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[rwt]='0 rwl:0 idle_power:- active_power:-' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n - ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2[active_power_workload]="-"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2[active_power_workload]=- 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@53 -- # local -n _ctrl_ns=nvme2_ns 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/ng2n1 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@56 -- # ns_dev=ng2n1 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@57 -- # nvme_get ng2n1 id-ns /dev/ng2n1 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=ng2n1 reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'ng2n1=()' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/ng2n1 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nsze]="0x100000"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nsze]=0x100000 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[ncap]="0x100000"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[ncap]=0x100000 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # 
IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nuse]="0x100000"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nuse]=0x100000 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nsfeat]="0x14"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nsfeat]=0x14 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nlbaf]="7"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nlbaf]=7 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[flbas]="0x4"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[flbas]=0x4 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[mc]="0x3"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[mc]=0x3 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[dpc]="0x1f"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[dpc]=0x1f 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[dps]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[dps]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nmic]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nmic]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[rescap]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[rescap]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 
04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[fpi]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[fpi]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[dlfeat]="1"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[dlfeat]=1 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nawun]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nawun]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nawupf]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nawupf]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nacwu]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nacwu]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nabsn]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nabsn]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nabo]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nabo]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nabspf]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nabspf]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[noiob]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[noiob]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nvmcap]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nvmcap]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- 
# IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[npwg]="0"' 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[npwg]=0 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.554 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[npwa]="0"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[npwa]=0 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[npdg]="0"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[npdg]=0 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[npda]="0"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[npda]=0 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nows]="0"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nows]=0 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[mssrl]="128"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[mssrl]=128 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[mcl]="128"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[mcl]=128 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[msrc]="127"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[msrc]=127 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nulbaf]="0"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nulbaf]=0 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.555 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # eval 'ng2n1[anagrpid]="0"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[anagrpid]=0 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nsattr]="0"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nsattr]=0 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nvmsetid]="0"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nvmsetid]=0 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[endgid]="0"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[endgid]=0 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[nguid]="00000000000000000000000000000000"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[nguid]=00000000000000000000000000000000 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[eui64]="0000000000000000"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[eui64]=0000000000000000 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg 
val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n1[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n1[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=ng2n1 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/ng2n2 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@56 -- # ns_dev=ng2n2 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@57 -- # nvme_get ng2n2 id-ns /dev/ng2n2 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=ng2n2 reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'ng2n2=()' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/ng2n2 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nsze]="0x100000"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # 
ng2n2[nsze]=0x100000 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[ncap]="0x100000"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[ncap]=0x100000 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nuse]="0x100000"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nuse]=0x100000 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nsfeat]="0x14"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nsfeat]=0x14 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nlbaf]="7"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nlbaf]=7 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[flbas]="0x4"' 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[flbas]=0x4 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.555 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[mc]="0x3"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[mc]=0x3 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[dpc]="0x1f"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[dpc]=0x1f 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[dps]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[dps]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nmic]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nmic]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read 
-r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[rescap]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[rescap]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[fpi]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[fpi]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[dlfeat]="1"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[dlfeat]=1 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nawun]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nawun]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nawupf]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nawupf]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nacwu]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nacwu]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nabsn]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nabsn]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nabo]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nabo]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nabspf]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nabspf]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[noiob]="0"' 00:09:19.556 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # ng2n2[noiob]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nvmcap]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nvmcap]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[npwg]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[npwg]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[npwa]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[npwa]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[npdg]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[npdg]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[npda]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[npda]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nows]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nows]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[mssrl]="128"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[mssrl]=128 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[mcl]="128"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[mcl]=128 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[msrc]="127"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[msrc]=127 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 
04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nulbaf]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nulbaf]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[anagrpid]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[anagrpid]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nsattr]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nsattr]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nvmsetid]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nvmsetid]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[endgid]="0"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[endgid]=0 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[nguid]="00000000000000000000000000000000"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[nguid]=00000000000000000000000000000000 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[eui64]="0000000000000000"' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[eui64]=0000000000000000 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.556 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:19.556 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n2[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n2[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=ng2n2 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/ng2n3 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@56 -- # ns_dev=ng2n3 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@57 -- # nvme_get ng2n3 id-ns /dev/ng2n3 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=ng2n3 reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'ng2n3=()' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # 
/usr/local/src/nvme-cli/nvme id-ns /dev/ng2n3 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nsze]="0x100000"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nsze]=0x100000 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[ncap]="0x100000"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[ncap]=0x100000 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nuse]="0x100000"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nuse]=0x100000 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nsfeat]="0x14"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nsfeat]=0x14 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nlbaf]="7"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nlbaf]=7 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[flbas]="0x4"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[flbas]=0x4 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[mc]="0x3"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[mc]=0x3 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[dpc]="0x1f"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[dpc]=0x1f 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[dps]="0"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[dps]=0 00:09:19.557 
04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nmic]="0"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nmic]=0 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[rescap]="0"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[rescap]=0 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[fpi]="0"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[fpi]=0 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[dlfeat]="1"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[dlfeat]=1 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nawun]="0"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nawun]=0 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nawupf]="0"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nawupf]=0 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nacwu]="0"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nacwu]=0 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nabsn]="0"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nabsn]=0 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nabo]="0"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nabo]=0 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 
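(Editor's note on the trace above: each "[[ -n ... ]]" / "eval" pair is nvme/functions.sh's nvme_get splitting one "field : value" line of nvme id-ns output on IFS=: and storing it in a global associative array named after the namespace, ng2n3 here. A minimal sketch of that pattern follows; parse_id_ns and demo are hypothetical names, and canned input stands in for /usr/local/src/nvme-cli/nvme id-ns. Not the real nvme/functions.sh.)

#!/usr/bin/env bash
# Sketch only: mirrors the IFS=: / read / eval pattern in the trace.
# Requires bash 4+ for associative arrays.
parse_id_ns() {
    local ref=$1 reg val
    declare -gA "$ref=()"                      # global array, e.g. ng2n3=()
    while IFS=: read -r reg val; do
        reg="${reg#"${reg%%[![:space:]]*}"}"   # left-trim field name
        reg="${reg%"${reg##*[![:space:]]}"}"   # right-trim field name
        val="${val#"${val%%[![:space:]]*}"}"   # left-trim value
        [[ -n $val ]] && eval "${ref}[${reg}]=\"${val}\""
    done
}

# Canned lines instead of: /usr/local/src/nvme-cli/nvme id-ns /dev/ng2n3
parse_id_ns demo <<'EOF'
nsze  : 0x100000
ncap  : 0x100000
nlbaf : 7
EOF
declare -p demo   # prints the array with all three fields populated
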
00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nabspf]="0"' 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nabspf]=0 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.557 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[noiob]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[noiob]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nvmcap]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nvmcap]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[npwg]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[npwg]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[npwa]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[npwa]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[npdg]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[npdg]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[npda]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[npda]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nows]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nows]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[mssrl]="128"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[mssrl]=128 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[mcl]="128"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[mcl]=128 00:09:19.558 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[msrc]="127"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[msrc]=127 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nulbaf]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nulbaf]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[anagrpid]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[anagrpid]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nsattr]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nsattr]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nvmsetid]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nvmsetid]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[endgid]="0"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[endgid]=0 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[nguid]="00000000000000000000000000000000"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[nguid]=00000000000000000000000000000000 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[eui64]="0000000000000000"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[eui64]=0000000000000000 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:19.558 04:59:39 
nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'ng2n3[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # ng2n3[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=ng2n3 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/nvme2n1 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@56 -- # ns_dev=nvme2n1 00:09:19.558 04:59:39 nvme_fdp -- 
nvme/functions.sh@57 -- # nvme_get nvme2n1 id-ns /dev/nvme2n1 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=nvme2n1 reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'nvme2n1=()' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/nvme2n1 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nsze]="0x100000"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nsze]=0x100000 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[ncap]="0x100000"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[ncap]=0x100000 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nuse]="0x100000"' 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nuse]=0x100000 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.558 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nsfeat]="0x14"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nsfeat]=0x14 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nlbaf]="7"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nlbaf]=7 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[flbas]="0x4"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[flbas]=0x4 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[mc]="0x3"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[mc]=0x3 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:19.559 
04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[dpc]="0x1f"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[dpc]=0x1f 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[dps]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[dps]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nmic]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nmic]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[rescap]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[rescap]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[fpi]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[fpi]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[dlfeat]="1"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[dlfeat]=1 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nawun]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nawun]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nawupf]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nawupf]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nacwu]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nacwu]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nabsn]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nabsn]=0 00:09:19.559 04:59:39 
nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nabo]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nabo]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nabspf]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nabspf]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[noiob]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[noiob]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nvmcap]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nvmcap]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[npwg]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[npwg]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[npwa]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[npwa]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[npdg]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[npdg]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[npda]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[npda]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nows]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nows]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- 
# [[ -n 128 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[mssrl]="128"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[mssrl]=128 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[mcl]="128"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[mcl]=128 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[msrc]="127"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[msrc]=127 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nulbaf]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nulbaf]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[anagrpid]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[anagrpid]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nsattr]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nsattr]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nvmsetid]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nvmsetid]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[endgid]="0"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[endgid]=0 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[nguid]="00000000000000000000000000000000"' 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[nguid]=00000000000000000000000000000000 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.559 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:19.560 
04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[eui64]="0000000000000000"' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[eui64]=0000000000000000 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n1[lbaf7]="ms:64 lbads:12 rp:0 "' 
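(Editor's note: the lbaf0..lbaf7 strings captured above encode the supported LBA formats. ms is metadata bytes per block, lbads is log2 of the data size, so lbads:9 means 512-byte and lbads:12 means 4096-byte blocks, rp is relative performance, and "(in use)" marks the active format, whose index also sits in the low nibble of flbas, 0x4 here. A small illustrative helper, not part of nvme/functions.sh, showing the arithmetic against an array shaped like the ones built in this trace:)

# lbaf_in_use is a hypothetical name; needs bash 4.3+ for the nameref.
lbaf_in_use() {
    local -n ns=$1                        # e.g. nvme2n1, as filled above
    local fmt=$(( ${ns[flbas]} & 0xf ))   # low nibble selects the format
    local lbads
    lbads=$(grep -o 'lbads:[0-9]*' <<< "${ns[lbaf$fmt]}")
    echo "format $fmt: $(( 1 << ${lbads#lbads:} ))-byte data blocks"
}

declare -A nvme2n1=([flbas]=0x4 [lbaf4]='ms:0 lbads:12 rp:0 (in use)')
lbaf_in_use nvme2n1    # -> format 4: 4096-byte data blocks
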
00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n1[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme2n1 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/nvme2n2 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@56 -- # ns_dev=nvme2n2 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@57 -- # nvme_get nvme2n2 id-ns /dev/nvme2n2 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=nvme2n2 reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'nvme2n2=()' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/nvme2n2 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nsze]="0x100000"' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nsze]=0x100000 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[ncap]="0x100000"' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[ncap]=0x100000 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nuse]="0x100000"' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nuse]=0x100000 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nsfeat]="0x14"' 00:09:19.560 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nsfeat]=0x14 00:09:19.838 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.838 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.838 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.838 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nlbaf]="7"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nlbaf]=7 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[flbas]="0x4"' 00:09:19.839 04:59:39 nvme_fdp 
-- nvme/functions.sh@23 -- # nvme2n2[flbas]=0x4 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[mc]="0x3"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[mc]=0x3 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[dpc]="0x1f"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[dpc]=0x1f 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[dps]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[dps]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nmic]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nmic]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[rescap]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[rescap]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[fpi]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[fpi]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[dlfeat]="1"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[dlfeat]=1 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nawun]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nawun]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nawupf]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nawupf]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 
-- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nacwu]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nacwu]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nabsn]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nabsn]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nabo]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nabo]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nabspf]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nabspf]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[noiob]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[noiob]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nvmcap]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nvmcap]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[npwg]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[npwg]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[npwa]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[npwa]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[npdg]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[npdg]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 
'nvme2n2[npda]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[npda]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nows]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nows]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[mssrl]="128"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[mssrl]=128 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[mcl]="128"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[mcl]=128 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[msrc]="127"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[msrc]=127 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nulbaf]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nulbaf]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[anagrpid]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[anagrpid]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nsattr]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nsattr]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nvmsetid]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nvmsetid]=0 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[endgid]="0"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[endgid]=0 00:09:19.839 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[nguid]="00000000000000000000000000000000"' 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[nguid]=00000000000000000000000000000000 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:19.839 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[eui64]="0000000000000000"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[eui64]=0000000000000000 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 
nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n2[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n2[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme2n2 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@54 -- # for ns in "$ctrl/"@("ng${ctrl##*nvme}"|"${ctrl##*/}n")* 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@55 -- # [[ -e /sys/class/nvme/nvme2/nvme2n3 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@56 -- # ns_dev=nvme2n3 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@57 -- # nvme_get nvme2n3 id-ns /dev/nvme2n3 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=nvme2n3 reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'nvme2n3=()' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ns /dev/nvme2n3 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nsze]="0x100000"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nsze]=0x100000 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[ncap]="0x100000"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[ncap]=0x100000 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100000 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nuse]="0x100000"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nuse]=0x100000 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x14 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nsfeat]="0x14"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nsfeat]=0x14 00:09:19.840 04:59:39 
nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nlbaf]="7"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nlbaf]=7 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x4 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[flbas]="0x4"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[flbas]=0x4 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[mc]="0x3"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[mc]=0x3 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1f ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[dpc]="0x1f"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[dpc]=0x1f 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[dps]="0"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[dps]=0 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nmic]="0"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nmic]=0 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[rescap]="0"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[rescap]=0 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[fpi]="0"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[fpi]=0 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[dlfeat]="1"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[dlfeat]=1 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- 
nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nawun]="0"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nawun]=0 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nawupf]="0"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nawupf]=0 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nacwu]="0"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nacwu]=0 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nabsn]="0"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nabsn]=0 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nabo]="0"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nabo]=0 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nabspf]="0"' 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nabspf]=0 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.840 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[noiob]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[noiob]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nvmcap]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nvmcap]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[npwg]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[npwg]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[npwa]="0"' 00:09:19.841 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # nvme2n3[npwa]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[npdg]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[npdg]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[npda]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[npda]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nows]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nows]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[mssrl]="128"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[mssrl]=128 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 128 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[mcl]="128"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[mcl]=128 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 127 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[msrc]="127"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[msrc]=127 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nulbaf]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nulbaf]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[anagrpid]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[anagrpid]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nsattr]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nsattr]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nvmsetid]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nvmsetid]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[endgid]="0"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[endgid]=0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000000000000000000000000000 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[nguid]="00000000000000000000000000000000"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[nguid]=00000000000000000000000000000000 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0000000000000000 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[eui64]="0000000000000000"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[eui64]=0000000000000000 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:9 rp:0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf0]="ms:0 lbads:9 rp:0 "' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[lbaf0]='ms:0 lbads:9 rp:0 ' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:9 rp:0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf1]="ms:8 lbads:9 rp:0 "' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[lbaf1]='ms:8 lbads:9 rp:0 ' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:9 rp:0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf2]="ms:16 lbads:9 rp:0 "' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[lbaf2]='ms:16 lbads:9 rp:0 ' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:9 rp:0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf3]="ms:64 lbads:9 rp:0 "' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[lbaf3]='ms:64 lbads:9 rp:0 ' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:0 lbads:12 rp:0 (in use) ]] 00:09:19.841 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf4]="ms:0 lbads:12 rp:0 (in use)"' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[lbaf4]='ms:0 lbads:12 rp:0 (in use)' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:8 lbads:12 rp:0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf5]="ms:8 lbads:12 rp:0 "' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[lbaf5]='ms:8 lbads:12 rp:0 ' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:16 lbads:12 rp:0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf6]="ms:16 lbads:12 rp:0 "' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[lbaf6]='ms:16 lbads:12 rp:0 ' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n ms:64 lbads:12 rp:0 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme2n3[lbaf7]="ms:64 lbads:12 rp:0 "' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme2n3[lbaf7]='ms:64 lbads:12 rp:0 ' 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@58 -- # _ctrl_ns[${ns##*n}]=nvme2n3 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@60 -- # ctrls["$ctrl_dev"]=nvme2 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@61 -- # nvmes["$ctrl_dev"]=nvme2_ns 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@62 -- # bdfs["$ctrl_dev"]=0000:00:12.0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@63 -- # ordered_ctrls[${ctrl_dev/nvme/}]=nvme2 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@47 -- # for ctrl in /sys/class/nvme/nvme* 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@48 -- # [[ -e /sys/class/nvme/nvme3 ]] 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@49 -- # pci=0000:00:13.0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@50 -- # pci_can_use 0000:00:13.0 00:09:19.841 04:59:39 nvme_fdp -- scripts/common.sh@18 -- # local i 00:09:19.841 04:59:39 nvme_fdp -- scripts/common.sh@21 -- # [[ =~ 0000:00:13.0 ]] 00:09:19.841 04:59:39 nvme_fdp -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:19.841 04:59:39 nvme_fdp -- scripts/common.sh@27 -- # return 0 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@51 -- # ctrl_dev=nvme3 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@52 -- # nvme_get nvme3 id-ctrl /dev/nvme3 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@17 -- # local ref=nvme3 reg val 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@18 -- # shift 00:09:19.841 04:59:39 nvme_fdp -- nvme/functions.sh@20 -- # local -gA 'nvme3=()' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@16 -- # /usr/local/src/nvme-cli/nvme id-ctrl /dev/nvme3 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n '' ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1b36 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[vid]="0x1b36"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[vid]=0x1b36 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1af4 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[ssvid]="0x1af4"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[ssvid]=0x1af4 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 12343 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[sn]="12343 "' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[sn]='12343 ' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n QEMU NVMe Ctrl ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[mn]="QEMU NVMe Ctrl "' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[mn]='QEMU NVMe Ctrl ' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 8.0.0 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[fr]="8.0.0 "' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[fr]='8.0.0 ' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 6 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[rab]="6"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[rab]=6 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 525400 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[ieee]="525400"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[ieee]=525400 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x2 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[cmic]="0x2"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[cmic]=0x2 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 7 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[mdts]="7"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[mdts]=7 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 
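Note on the pattern above: every identify field in this trace is captured by the same nvme_get loop in nvme/functions.sh; it runs nvme id-ctrl (or id-ns), reads the output with IFS=: so each line splits into a register name and a value, and evals the pair into a global associative array (nvme3 here). A minimal sketch of that idea, assuming nvme-cli is on PATH (the real helper also handles argument shifting and quoting that this sketch omits):

    # Sketch: fold "field : value" output from nvme-cli into an associative array.
    declare -gA ctrl=()
    while IFS=: read -r reg val; do
        [[ -n $val ]] || continue                  # keep only "reg : val" lines
        reg=${reg//[[:space:]]/}                   # strip whitespace from the key
        val=${val#"${val%%[![:space:]]*}"}         # left-trim the value
        ctrl[$reg]=$val
    done < <(nvme id-ctrl /dev/nvme3)
    printf 'ctratt=%s\n' "${ctrl[ctratt]}"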
00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[cntlid]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[cntlid]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x10400 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[ver]="0x10400"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[ver]=0x10400 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[rtd3r]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[rtd3r]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[rtd3e]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[rtd3e]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x100 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[oaes]="0x100"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[oaes]=0x100 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x88010 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[ctratt]="0x88010"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[ctratt]=0x88010 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[rrls]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[rrls]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[cntrltype]="1"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[cntrltype]=1 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 00000000-0000-0000-0000-000000000000 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[fguid]="00000000-0000-0000-0000-000000000000"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[fguid]=00000000-0000-0000-0000-000000000000 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.842 04:59:39 nvme_fdp -- 
nvme/functions.sh@23 -- # eval 'nvme3[crdt1]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[crdt1]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[crdt2]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[crdt2]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[crdt3]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[crdt3]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[nvmsr]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[nvmsr]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[vwci]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[vwci]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[mec]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[mec]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x12a ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[oacs]="0x12a"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[oacs]=0x12a 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[acl]="3"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[acl]=3 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 3 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[aerl]="3"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[aerl]=3 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[frmw]="0x3"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[frmw]=0x3 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 
04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[lpa]="0x7"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[lpa]=0x7 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[elpe]="0"' 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[elpe]=0 00:09:19.842 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[npss]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[npss]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[avscc]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[avscc]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[apsta]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[apsta]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 343 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[wctemp]="343"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[wctemp]=343 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 373 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[cctemp]="373"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[cctemp]=373 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[mtfa]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[mtfa]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[hmpre]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[hmpre]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # 
eval 'nvme3[hmmin]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[hmmin]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[tnvmcap]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[tnvmcap]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[unvmcap]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[unvmcap]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[rpmbs]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[rpmbs]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[edstt]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[edstt]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[dsto]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[dsto]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[fwug]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[fwug]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[kas]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[kas]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[hctma]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[hctma]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[mntmt]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[mntmt]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- 
nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[mxtmt]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[mxtmt]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[sanicap]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[sanicap]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[hmminds]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[hmminds]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[hmmaxd]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[hmmaxd]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[nsetidmax]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[nsetidmax]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 1 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[endgidmax]="1"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[endgidmax]=1 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[anatt]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[anatt]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[anacap]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[anacap]=0 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[anagrpmax]="0"' 00:09:19.843 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[anagrpmax]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- 
# eval 'nvme3[nanagrpid]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[nanagrpid]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[pels]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[pels]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[domainid]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[domainid]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[megcap]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[megcap]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x66 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[sqes]="0x66"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[sqes]=0x66 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x44 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[cqes]="0x44"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[cqes]=0x44 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[maxcmd]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[maxcmd]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 256 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[nn]="256"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[nn]=256 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x15d ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[oncs]="0x15d"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[oncs]=0x15d 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[fuses]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[fuses]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 
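Most of the controller fields being stored here (oacs=0x12a, oncs=0x15d, frmw=0x3, and so on) are bitmasks, so later feature checks reduce to shell arithmetic. For example (bit position taken from the NVMe base specification, not from this log; ONCS bit 2 advertises Dataset Management):

    oncs=0x15d                        # value recorded for nvme3 above
    if (( oncs & (1 << 2) )); then    # ONCS bit 2: Dataset Management
        echo "controller supports DSM/deallocate"
    fi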
00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[fna]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[fna]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x7 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[vwc]="0x7"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[vwc]=0x7 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[awun]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[awun]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[awupf]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[awupf]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[icsvscc]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[icsvscc]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[nwpc]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[nwpc]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[acwu]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[acwu]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x3 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[ocfs]="0x3"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[ocfs]=0x3 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0x1 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[sgls]="0x1"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[sgls]=0x1 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 
-- # eval 'nvme3[mnan]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[mnan]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[maxdna]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[maxdna]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[maxcna]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[maxcna]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n nqn.2019-08.org.qemu:fdp-subsys3 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[subnqn]="nqn.2019-08.org.qemu:fdp-subsys3"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[subnqn]=nqn.2019-08.org.qemu:fdp-subsys3 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[ioccsz]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[ioccsz]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[iorcsz]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[iorcsz]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[icdoff]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[icdoff]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[fcatt]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[fcatt]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[msdbd]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[msdbd]=0 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[ofcs]="0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[ofcs]=0 
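The identify dump for nvme3 ends just below with its power-state fields, after which the discovery loop files the controller into the suite's global maps: ctrls points at the parsed identify array, nvmes at the array naming its namespaces, bdfs at the PCI address it was probed on, and ordered_ctrls keeps controllers sorted by index. Roughly, as a sketch of that bookkeeping (array names from the trace, bodies reconstructed):

    declare -gA ctrls=() nvmes=() bdfs=()
    declare -ga ordered_ctrls=()
    ctrl_dev=nvme3
    ctrls[$ctrl_dev]=nvme3                     # identify data parsed above
    nvmes[$ctrl_dev]=nvme3_ns                  # array listing nvme3's namespaces
    bdfs[$ctrl_dev]=0000:00:13.0               # PCI address the device sits on
    ordered_ctrls[${ctrl_dev/nvme/}]=nvme3     # index 3 -> nvme3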
00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0 ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[ps0]="mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[ps0]='mp:25.00W operational enlat:16 exlat:4 rrt:0 rrl:0' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n 0 rwl:0 idle_power:- active_power:- ]] 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[rwt]="0 rwl:0 idle_power:- active_power:-"' 00:09:19.844 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[rwt]='0 rwl:0 idle_power:- active_power:-' 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@22 -- # [[ -n - ]] 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # eval 'nvme3[active_power_workload]="-"' 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@23 -- # nvme3[active_power_workload]=- 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # IFS=: 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@21 -- # read -r reg val 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@53 -- # local -n _ctrl_ns=nvme3_ns 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@60 -- # ctrls["$ctrl_dev"]=nvme3 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@61 -- # nvmes["$ctrl_dev"]=nvme3_ns 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@62 -- # bdfs["$ctrl_dev"]=0000:00:13.0 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@63 -- # ordered_ctrls[${ctrl_dev/nvme/}]=nvme3 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@65 -- # (( 4 > 0 )) 00:09:19.845 04:59:39 nvme_fdp -- nvme/nvme_fdp.sh@13 -- # get_ctrl_with_feature fdp 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@204 -- # local _ctrls feature=fdp 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@206 -- # _ctrls=($(get_ctrls_with_feature "$feature")) 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@206 -- # get_ctrls_with_feature fdp 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@192 -- # (( 4 == 0 )) 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@194 -- # local ctrl feature=fdp 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@196 -- # type -t ctrl_has_fdp 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@196 -- # [[ function == function ]] 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@198 -- # for ctrl in "${!ctrls[@]}" 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@199 -- # ctrl_has_fdp nvme1 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@176 -- # local ctrl=nvme1 ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@178 -- # get_ctratt nvme1 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@166 -- # local ctrl=nvme1 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@167 -- # get_nvme_ctrl_feature nvme1 ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@69 -- # local ctrl=nvme1 reg=ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@71 -- # [[ -n nvme1 ]] 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@73 -- # local -n _ctrl=nvme1 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@75 
-- # [[ -n 0x8000 ]] 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@76 -- # echo 0x8000 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@178 -- # ctratt=0x8000 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@180 -- # (( ctratt & 1 << 19 )) 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@198 -- # for ctrl in "${!ctrls[@]}" 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@199 -- # ctrl_has_fdp nvme0 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@176 -- # local ctrl=nvme0 ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@178 -- # get_ctratt nvme0 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@166 -- # local ctrl=nvme0 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@167 -- # get_nvme_ctrl_feature nvme0 ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@69 -- # local ctrl=nvme0 reg=ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@71 -- # [[ -n nvme0 ]] 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@73 -- # local -n _ctrl=nvme0 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@75 -- # [[ -n 0x8000 ]] 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@76 -- # echo 0x8000 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@178 -- # ctratt=0x8000 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@180 -- # (( ctratt & 1 << 19 )) 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@198 -- # for ctrl in "${!ctrls[@]}" 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@199 -- # ctrl_has_fdp nvme3 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@176 -- # local ctrl=nvme3 ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@178 -- # get_ctratt nvme3 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@166 -- # local ctrl=nvme3 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@167 -- # get_nvme_ctrl_feature nvme3 ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@69 -- # local ctrl=nvme3 reg=ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@71 -- # [[ -n nvme3 ]] 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@73 -- # local -n _ctrl=nvme3 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@75 -- # [[ -n 0x88010 ]] 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@76 -- # echo 0x88010 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@178 -- # ctratt=0x88010 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@180 -- # (( ctratt & 1 << 19 )) 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@199 -- # echo nvme3 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@198 -- # for ctrl in "${!ctrls[@]}" 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@199 -- # ctrl_has_fdp nvme2 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@176 -- # local ctrl=nvme2 ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@178 -- # get_ctratt nvme2 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@166 -- # local ctrl=nvme2 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@167 -- # get_nvme_ctrl_feature nvme2 ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@69 -- # local ctrl=nvme2 reg=ctratt 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@71 -- # [[ -n nvme2 ]] 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@73 -- # local -n _ctrl=nvme2 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@75 -- # [[ -n 0x8000 ]] 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@76 -- # echo 0x8000 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@178 -- # ctratt=0x8000 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@180 -- # (( ctratt & 1 << 19 )) 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@207 -- 
# (( 1 > 0 )) 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@208 -- # echo nvme3 00:09:19.845 04:59:39 nvme_fdp -- nvme/functions.sh@209 -- # return 0 00:09:19.845 04:59:39 nvme_fdp -- nvme/nvme_fdp.sh@13 -- # ctrl=nvme3 00:09:19.845 04:59:39 nvme_fdp -- nvme/nvme_fdp.sh@14 -- # bdf=0000:00:13.0 00:09:19.845 04:59:39 nvme_fdp -- nvme/nvme_fdp.sh@16 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:09:20.120 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:09:20.691 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:09:20.692 0000:00:11.0 (1b36 0010): nvme -> uio_pci_generic 00:09:20.692 0000:00:13.0 (1b36 0010): nvme -> uio_pci_generic 00:09:20.692 0000:00:12.0 (1b36 0010): nvme -> uio_pci_generic 00:09:20.692 04:59:40 nvme_fdp -- nvme/nvme_fdp.sh@18 -- # run_test nvme_flexible_data_placement /home/vagrant/spdk_repo/spdk/test/nvme/fdp/fdp -r 'trtype:pcie traddr:0000:00:13.0' 00:09:20.692 04:59:40 nvme_fdp -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:09:20.692 04:59:40 nvme_fdp -- common/autotest_common.sh@1111 -- # xtrace_disable 00:09:20.692 04:59:40 nvme_fdp -- common/autotest_common.sh@10 -- # set +x 00:09:20.692 ************************************ 00:09:20.692 START TEST nvme_flexible_data_placement 00:09:20.692 ************************************ 00:09:20.692 04:59:40 nvme_fdp.nvme_flexible_data_placement -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/fdp/fdp -r 'trtype:pcie traddr:0000:00:13.0' 00:09:20.953 Initializing NVMe Controllers 00:09:20.953 Attaching to 0000:00:13.0 00:09:20.953 Controller supports FDP Attached to 0000:00:13.0 00:09:20.953 Namespace ID: 1 Endurance Group ID: 1 00:09:20.953 Initialization complete. 
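The controller walk traced above keeps the one controller whose Identify Controller CTRATT value has the Flexible Data Placement bit (bit 19) set: nvme3 reports ctratt=0x88010, and 0x88010 & (1 << 19) is non-zero, while the other controllers report 0x8000 and are skipped. A minimal stand-alone sketch of the same capability check, assuming nvme-cli and jq are available and the controller is still bound to the kernel nvme driver (the /dev/nvme3 path is illustrative):

    # Read CTRATT from Identify Controller, then test bit 19 (FDP support).
    ctratt=$(nvme id-ctrl /dev/nvme3 --output-format=json | jq -r '.ctratt')
    if (( ctratt & 1 << 19 )); then
        echo "controller supports Flexible Data Placement"
    fi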
00:09:20.953 00:09:20.953 ================================== 00:09:20.953 == FDP tests for Namespace: #01 == 00:09:20.953 ================================== 00:09:20.953 00:09:20.953 Get Feature: FDP: 00:09:20.953 ================= 00:09:20.953 Enabled: Yes 00:09:20.953 FDP configuration Index: 0 00:09:20.953 00:09:20.953 FDP configurations log page 00:09:20.953 =========================== 00:09:20.953 Number of FDP configurations: 1 00:09:20.953 Version: 0 00:09:20.953 Size: 112 00:09:20.953 FDP Configuration Descriptor: 0 00:09:20.953 Descriptor Size: 96 00:09:20.953 Reclaim Group Identifier format: 2 00:09:20.953 FDP Volatile Write Cache: Not Present 00:09:20.953 FDP Configuration: Valid 00:09:20.953 Vendor Specific Size: 0 00:09:20.953 Number of Reclaim Groups: 2 00:09:20.953 Number of Reclaim Unit Handles: 8 00:09:20.953 Max Placement Identifiers: 128 00:09:20.953 Number of Namespaces Supported: 256 00:09:20.953 Reclaim Unit Nominal Size: 6000000 bytes 00:09:20.953 Estimated Reclaim Unit Time Limit: Not Reported 00:09:20.953 RUH Desc #000: RUH Type: Initially Isolated 00:09:20.953 RUH Desc #001: RUH Type: Initially Isolated 00:09:20.953 RUH Desc #002: RUH Type: Initially Isolated 00:09:20.953 RUH Desc #003: RUH Type: Initially Isolated 00:09:20.953 RUH Desc #004: RUH Type: Initially Isolated 00:09:20.953 RUH Desc #005: RUH Type: Initially Isolated 00:09:20.953 RUH Desc #006: RUH Type: Initially Isolated 00:09:20.953 RUH Desc #007: RUH Type: Initially Isolated 00:09:20.953 00:09:20.953 FDP reclaim unit handle usage log page 00:09:20.953 ====================================== 00:09:20.953 Number of Reclaim Unit Handles: 8 00:09:20.953 RUH Usage Desc #000: RUH Attributes: Controller Specified 00:09:20.953 RUH Usage Desc #001: RUH Attributes: Unused 00:09:20.953 RUH Usage Desc #002: RUH Attributes: Unused 00:09:20.953 RUH Usage Desc #003: RUH Attributes: Unused 00:09:20.953 RUH Usage Desc #004: RUH Attributes: Unused 00:09:20.953 RUH Usage Desc #005: RUH Attributes: Unused 00:09:20.953 RUH Usage Desc #006: RUH Attributes: Unused 00:09:20.953 RUH Usage Desc #007: RUH Attributes: Unused 00:09:20.953 00:09:20.953 FDP statistics log page 00:09:20.953 ======================= 00:09:20.953 Host bytes with metadata written: 2077724672 00:09:20.953 Media bytes with metadata written: 2078060544 00:09:20.953 Media bytes erased: 0 00:09:20.953 00:09:20.953 FDP Reclaim unit handle status 00:09:20.953 ============================== 00:09:20.953 Number of RUHS descriptors: 2 00:09:20.953 RUHS Desc: #0000 PID: 0x0000 RUHID: 0x0000 ERUT: 0x00000000 RUAMW: 0x0000000000002287 00:09:20.953 RUHS Desc: #0001 PID: 0x4000 RUHID: 0x0000 ERUT: 0x00000000 RUAMW: 0x0000000000006000 00:09:20.953 00:09:20.953 FDP write on placement id: 0 success 00:09:20.953 00:09:20.953 Set Feature: Enabling FDP events on Placement handle: #0 Success 00:09:20.953 00:09:20.953 IO mgmt send: RUH update for Placement ID: #0 Success 00:09:20.953 00:09:20.953 Get Feature: FDP Events for Placement handle: #0 00:09:20.953 ======================== 00:09:20.953 Number of FDP Events: 6 00:09:20.953 FDP Event: #0 Type: RU Not Written to Capacity Enabled: Yes 00:09:20.953 FDP Event: #1 Type: RU Time Limit Exceeded Enabled: Yes 00:09:20.953 FDP Event: #2 Type: Ctrlr Reset Modified RUH's Enabled: Yes 00:09:20.953 FDP Event: #3 Type: Invalid Placement Identifier Enabled: Yes 00:09:20.953 FDP Event: #4 Type: Media Reallocated Enabled: No 00:09:20.953 FDP Event: #5 Type: Implicitly modified RUH Enabled: No 00:09:20.953 00:09:20.953 FDP events log
page 00:09:20.953 =================== 00:09:20.953 Number of FDP events: 1 00:09:20.953 FDP Event #0: 00:09:20.954 Event Type: RU Not Written to Capacity 00:09:20.954 Placement Identifier: Valid 00:09:20.954 NSID: Valid 00:09:20.954 Location: Valid 00:09:20.954 Placement Identifier: 0 00:09:20.954 Event Timestamp: 2 00:09:20.954 Namespace Identifier: 1 00:09:20.954 Reclaim Group Identifier: 0 00:09:20.954 Reclaim Unit Handle Identifier: 0 00:09:20.954 00:09:20.954 FDP test passed 00:09:20.954 ************************************ 00:09:20.954 END TEST nvme_flexible_data_placement 00:09:20.954 ************************************ 00:09:20.954 00:09:20.954 real 0m0.221s 00:09:20.954 user 0m0.064s 00:09:20.954 sys 0m0.056s 00:09:20.954 04:59:40 nvme_fdp.nvme_flexible_data_placement -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:20.954 04:59:40 nvme_fdp.nvme_flexible_data_placement -- common/autotest_common.sh@10 -- # set +x 00:09:20.954 ************************************ 00:09:20.954 END TEST nvme_fdp 00:09:20.954 ************************************ 00:09:20.954 00:09:20.954 real 0m7.595s 00:09:20.954 user 0m1.162s 00:09:20.954 sys 0m1.286s 00:09:20.954 04:59:41 nvme_fdp -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:20.954 04:59:41 nvme_fdp -- common/autotest_common.sh@10 -- # set +x 00:09:20.954 04:59:41 -- spdk/autotest.sh@232 -- # [[ '' -eq 1 ]] 00:09:20.954 04:59:41 -- spdk/autotest.sh@236 -- # run_test nvme_rpc /home/vagrant/spdk_repo/spdk/test/nvme/nvme_rpc.sh 00:09:20.954 04:59:41 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:09:20.954 04:59:41 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:09:20.954 04:59:41 -- common/autotest_common.sh@10 -- # set +x 00:09:20.954 ************************************ 00:09:20.954 START TEST nvme_rpc 00:09:20.954 ************************************ 00:09:20.954 04:59:41 nvme_rpc -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/nvme_rpc.sh 00:09:21.215 * Looking for test storage... 
00:09:21.215 * Found test storage at /home/vagrant/spdk_repo/spdk/test/nvme 00:09:21.215 04:59:41 nvme_rpc -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:09:21.215 04:59:41 nvme_rpc -- common/autotest_common.sh@1711 -- # lcov --version 00:09:21.215 04:59:41 nvme_rpc -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:09:21.215 04:59:41 nvme_rpc -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@336 -- # IFS=.-: 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@336 -- # read -ra ver1 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@337 -- # IFS=.-: 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@337 -- # read -ra ver2 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@338 -- # local 'op=<' 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@340 -- # ver1_l=2 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@341 -- # ver2_l=1 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@344 -- # case "$op" in 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@345 -- # : 1 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@365 -- # decimal 1 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@353 -- # local d=1 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@355 -- # echo 1 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@365 -- # ver1[v]=1 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@366 -- # decimal 2 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@353 -- # local d=2 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@355 -- # echo 2 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@366 -- # ver2[v]=2 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:21.215 04:59:41 nvme_rpc -- scripts/common.sh@368 -- # return 0 00:09:21.215 04:59:41 nvme_rpc -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:21.215 04:59:41 nvme_rpc -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:09:21.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:21.215 --rc genhtml_branch_coverage=1 00:09:21.215 --rc genhtml_function_coverage=1 00:09:21.215 --rc genhtml_legend=1 00:09:21.215 --rc geninfo_all_blocks=1 00:09:21.215 --rc geninfo_unexecuted_blocks=1 00:09:21.215 00:09:21.215 ' 00:09:21.215 04:59:41 nvme_rpc -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:09:21.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:21.215 --rc genhtml_branch_coverage=1 00:09:21.215 --rc genhtml_function_coverage=1 00:09:21.215 --rc genhtml_legend=1 00:09:21.215 --rc geninfo_all_blocks=1 00:09:21.215 --rc geninfo_unexecuted_blocks=1 00:09:21.215 00:09:21.215 ' 00:09:21.215 04:59:41 nvme_rpc -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 
00:09:21.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:21.215 --rc genhtml_branch_coverage=1 00:09:21.215 --rc genhtml_function_coverage=1 00:09:21.215 --rc genhtml_legend=1 00:09:21.215 --rc geninfo_all_blocks=1 00:09:21.215 --rc geninfo_unexecuted_blocks=1 00:09:21.215 00:09:21.215 ' 00:09:21.215 04:59:41 nvme_rpc -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:09:21.215 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:21.215 --rc genhtml_branch_coverage=1 00:09:21.215 --rc genhtml_function_coverage=1 00:09:21.215 --rc genhtml_legend=1 00:09:21.215 --rc geninfo_all_blocks=1 00:09:21.215 --rc geninfo_unexecuted_blocks=1 00:09:21.215 00:09:21.216 ' 00:09:21.216 04:59:41 nvme_rpc -- nvme/nvme_rpc.sh@11 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:09:21.216 04:59:41 nvme_rpc -- nvme/nvme_rpc.sh@13 -- # get_first_nvme_bdf 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1509 -- # bdfs=() 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1509 -- # local bdfs 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1510 -- # bdfs=($(get_nvme_bdfs)) 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1510 -- # get_nvme_bdfs 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1498 -- # bdfs=() 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1498 -- # local bdfs 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1499 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1499 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1499 -- # jq -r '.config[].params.traddr' 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1500 -- # (( 4 == 0 )) 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1504 -- # printf '%s\n' 0000:00:10.0 0000:00:11.0 0000:00:12.0 0000:00:13.0 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@1512 -- # echo 0000:00:10.0 00:09:21.216 04:59:41 nvme_rpc -- nvme/nvme_rpc.sh@13 -- # bdf=0000:00:10.0 00:09:21.216 04:59:41 nvme_rpc -- nvme/nvme_rpc.sh@16 -- # spdk_tgt_pid=79354 00:09:21.216 04:59:41 nvme_rpc -- nvme/nvme_rpc.sh@15 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x3 00:09:21.216 04:59:41 nvme_rpc -- nvme/nvme_rpc.sh@17 -- # trap 'kill -9 ${spdk_tgt_pid}; exit 1' SIGINT SIGTERM EXIT 00:09:21.216 04:59:41 nvme_rpc -- nvme/nvme_rpc.sh@19 -- # waitforlisten 79354 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@835 -- # '[' -z 79354 ']' 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:21.216 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:09:21.216 04:59:41 nvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:21.216 [2024-12-15 04:59:41.298812] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:09:21.216 [2024-12-15 04:59:41.298924] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid79354 ] 00:09:21.477 [2024-12-15 04:59:41.457673] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:21.477 [2024-12-15 04:59:41.476826] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:09:21.477 [2024-12-15 04:59:41.476907] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:09:22.049 04:59:42 nvme_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:09:22.049 04:59:42 nvme_rpc -- common/autotest_common.sh@868 -- # return 0 00:09:22.049 04:59:42 nvme_rpc -- nvme/nvme_rpc.sh@21 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b Nvme0 -t PCIe -a 0000:00:10.0 00:09:22.310 Nvme0n1 00:09:22.310 04:59:42 nvme_rpc -- nvme/nvme_rpc.sh@27 -- # '[' -f non_existing_file ']' 00:09:22.310 04:59:42 nvme_rpc -- nvme/nvme_rpc.sh@32 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_apply_firmware non_existing_file Nvme0n1 00:09:22.572 request: 00:09:22.572 { 00:09:22.572 "bdev_name": "Nvme0n1", 00:09:22.572 "filename": "non_existing_file", 00:09:22.572 "method": "bdev_nvme_apply_firmware", 00:09:22.572 "req_id": 1 00:09:22.572 } 00:09:22.572 Got JSON-RPC error response 00:09:22.572 response: 00:09:22.572 { 00:09:22.572 "code": -32603, 00:09:22.572 "message": "open file failed." 00:09:22.572 } 00:09:22.572 04:59:42 nvme_rpc -- nvme/nvme_rpc.sh@32 -- # rv=1 00:09:22.572 04:59:42 nvme_rpc -- nvme/nvme_rpc.sh@33 -- # '[' -z 1 ']' 00:09:22.572 04:59:42 nvme_rpc -- nvme/nvme_rpc.sh@37 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_detach_controller Nvme0 00:09:22.832 04:59:42 nvme_rpc -- nvme/nvme_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:09:22.832 04:59:42 nvme_rpc -- nvme/nvme_rpc.sh@40 -- # killprocess 79354 00:09:22.832 04:59:42 nvme_rpc -- common/autotest_common.sh@954 -- # '[' -z 79354 ']' 00:09:22.832 04:59:42 nvme_rpc -- common/autotest_common.sh@958 -- # kill -0 79354 00:09:22.832 04:59:42 nvme_rpc -- common/autotest_common.sh@959 -- # uname 00:09:22.832 04:59:42 nvme_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:09:22.832 04:59:42 nvme_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 79354 00:09:22.832 04:59:42 nvme_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:09:22.832 04:59:42 nvme_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:09:22.832 killing process with pid 79354 00:09:22.832 04:59:42 nvme_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 79354' 00:09:22.832 04:59:42 nvme_rpc -- common/autotest_common.sh@973 -- # kill 79354 00:09:22.832 04:59:42 nvme_rpc -- common/autotest_common.sh@978 -- # wait 79354 00:09:23.093 00:09:23.093 real 0m2.032s 00:09:23.093 user 0m4.009s 00:09:23.093 sys 0m0.445s 00:09:23.093 04:59:43 nvme_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:23.093 ************************************ 00:09:23.093 END TEST nvme_rpc 00:09:23.093 ************************************ 00:09:23.093 04:59:43 nvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:23.093 04:59:43 -- spdk/autotest.sh@237 -- # run_test nvme_rpc_timeouts /home/vagrant/spdk_repo/spdk/test/nvme/nvme_rpc_timeouts.sh 00:09:23.093 04:59:43 -- common/autotest_common.sh@1105 -- # '[' 2 -le 
1 ']' 00:09:23.093 04:59:43 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:09:23.093 04:59:43 -- common/autotest_common.sh@10 -- # set +x 00:09:23.093 ************************************ 00:09:23.093 START TEST nvme_rpc_timeouts 00:09:23.093 ************************************ 00:09:23.093 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/nvme_rpc_timeouts.sh 00:09:23.093 * Looking for test storage... 00:09:23.093 * Found test storage at /home/vagrant/spdk_repo/spdk/test/nvme 00:09:23.093 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:09:23.093 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:09:23.093 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@1711 -- # lcov --version 00:09:23.093 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:09:23.093 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:23.093 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:23.093 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:23.093 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@336 -- # IFS=.-: 00:09:23.093 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@336 -- # read -ra ver1 00:09:23.093 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@337 -- # IFS=.-: 00:09:23.093 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@337 -- # read -ra ver2 00:09:23.093 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@338 -- # local 'op=<' 00:09:23.093 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@340 -- # ver1_l=2 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@341 -- # ver2_l=1 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@344 -- # case "$op" in 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@345 -- # : 1 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@365 -- # decimal 1 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@353 -- # local d=1 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@355 -- # echo 1 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@365 -- # ver1[v]=1 00:09:23.094 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@366 -- # decimal 2 00:09:23.355 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@353 -- # local d=2 00:09:23.355 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:23.355 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@355 -- # echo 2 00:09:23.355 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@366 -- # ver2[v]=2 00:09:23.355 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:23.355 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:23.355 04:59:43 nvme_rpc_timeouts -- scripts/common.sh@368 -- # return 0 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:09:23.355 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:23.355 --rc genhtml_branch_coverage=1 00:09:23.355 --rc genhtml_function_coverage=1 00:09:23.355 --rc genhtml_legend=1 00:09:23.355 --rc geninfo_all_blocks=1 00:09:23.355 --rc geninfo_unexecuted_blocks=1 00:09:23.355 00:09:23.355 ' 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:09:23.355 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:23.355 --rc genhtml_branch_coverage=1 00:09:23.355 --rc genhtml_function_coverage=1 00:09:23.355 --rc genhtml_legend=1 00:09:23.355 --rc geninfo_all_blocks=1 00:09:23.355 --rc geninfo_unexecuted_blocks=1 00:09:23.355 00:09:23.355 ' 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:09:23.355 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:23.355 --rc genhtml_branch_coverage=1 00:09:23.355 --rc genhtml_function_coverage=1 00:09:23.355 --rc genhtml_legend=1 00:09:23.355 --rc geninfo_all_blocks=1 00:09:23.355 --rc geninfo_unexecuted_blocks=1 00:09:23.355 00:09:23.355 ' 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:09:23.355 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:23.355 --rc genhtml_branch_coverage=1 00:09:23.355 --rc genhtml_function_coverage=1 00:09:23.355 --rc genhtml_legend=1 00:09:23.355 --rc geninfo_all_blocks=1 00:09:23.355 --rc geninfo_unexecuted_blocks=1 00:09:23.355 00:09:23.355 ' 00:09:23.355 04:59:43 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@19 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:09:23.355 04:59:43 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@21 -- # tmpfile_default_settings=/tmp/settings_default_79408 00:09:23.355 04:59:43 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@22 -- # tmpfile_modified_settings=/tmp/settings_modified_79408 00:09:23.355 04:59:43 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@25 -- # spdk_tgt_pid=79440 00:09:23.355 04:59:43 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@26 -- # trap 'kill -9 ${spdk_tgt_pid}; rm -f ${tmpfile_default_settings} ${tmpfile_modified_settings} ; exit 1' SIGINT SIGTERM EXIT 
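The trap armed at nvme_rpc_timeouts.sh@26 above is the usual autotest cleanup pattern: if the test is interrupted or fails early, the spdk_tgt child is killed and the two settings files are removed. The pattern in isolation (a sketch; the variable values are whatever the test assigned):

    # Arm cleanup for the lifetime of the test ...
    trap 'kill -9 ${spdk_tgt_pid}; rm -f ${tmpfile_default_settings} ${tmpfile_modified_settings}; exit 1' SIGINT SIGTERM EXIT
    # ... and disarm it on the success path (nvme_rpc_timeouts.sh@52 further
    # below), so the normal EXIT does not tear the target down a second time.
    trap - SIGINT SIGTERM EXIT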
00:09:23.355 04:59:43 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@27 -- # waitforlisten 79440 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@835 -- # '[' -z 79440 ']' 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@840 -- # local max_retries=100 00:09:23.355 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@844 -- # xtrace_disable 00:09:23.355 04:59:43 nvme_rpc_timeouts -- common/autotest_common.sh@10 -- # set +x 00:09:23.355 04:59:43 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@24 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x3 00:09:23.355 [2024-12-15 04:59:43.305707] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:09:23.355 [2024-12-15 04:59:43.305826] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid79440 ] 00:09:23.355 [2024-12-15 04:59:43.459201] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:23.355 [2024-12-15 04:59:43.478328] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:09:23.355 [2024-12-15 04:59:43.478366] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:09:24.298 04:59:44 nvme_rpc_timeouts -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:09:24.298 04:59:44 nvme_rpc_timeouts -- common/autotest_common.sh@868 -- # return 0 00:09:24.298 Checking default timeout settings: 00:09:24.298 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@29 -- # echo Checking default timeout settings: 00:09:24.298 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@30 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py save_config 00:09:24.559 Making settings changes with rpc: 00:09:24.559 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@32 -- # echo Making settings changes with rpc: 00:09:24.559 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@34 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_set_options --timeout-us=12000000 --timeout-admin-us=24000000 --action-on-timeout=abort 00:09:24.559 Check default vs. modified settings: 00:09:24.559 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@36 -- # echo Check default vs. 
modified settings: 00:09:24.559 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@37 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py save_config 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@38 -- # settings_to_check='action_on_timeout timeout_us timeout_admin_us' 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@39 -- # for setting in $settings_to_check 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # grep action_on_timeout /tmp/settings_default_79408 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # awk '{print $2}' 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # sed 's/[^a-zA-Z0-9]//g' 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # setting_before=none 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # grep action_on_timeout /tmp/settings_modified_79408 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # awk '{print $2}' 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # sed 's/[^a-zA-Z0-9]//g' 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # setting_modified=abort 00:09:25.134 Setting action_on_timeout is changed as expected. 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@42 -- # '[' none == abort ']' 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@47 -- # echo Setting action_on_timeout is changed as expected. 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@39 -- # for setting in $settings_to_check 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # grep timeout_us /tmp/settings_default_79408 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # sed 's/[^a-zA-Z0-9]//g' 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # awk '{print $2}' 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # setting_before=0 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # grep timeout_us /tmp/settings_modified_79408 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # awk '{print $2}' 00:09:25.134 04:59:44 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # sed 's/[^a-zA-Z0-9]//g' 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # setting_modified=12000000 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@42 -- # '[' 0 == 12000000 ']' 00:09:25.134 Setting timeout_us is changed as expected. 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@47 -- # echo Setting timeout_us is changed as expected. 
00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@39 -- # for setting in $settings_to_check 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # grep timeout_admin_us /tmp/settings_default_79408 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # awk '{print $2}' 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # sed 's/[^a-zA-Z0-9]//g' 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@40 -- # setting_before=0 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # grep timeout_admin_us /tmp/settings_modified_79408 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # awk '{print $2}' 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # sed 's/[^a-zA-Z0-9]//g' 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@41 -- # setting_modified=24000000 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@42 -- # '[' 0 == 24000000 ']' 00:09:25.134 Setting timeout_admin_us is changed as expected. 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@47 -- # echo Setting timeout_admin_us is changed as expected. 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@52 -- # trap - SIGINT SIGTERM EXIT 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@53 -- # rm -f /tmp/settings_default_79408 /tmp/settings_modified_79408 00:09:25.134 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@54 -- # killprocess 79440 00:09:25.134 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@954 -- # '[' -z 79440 ']' 00:09:25.134 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@958 -- # kill -0 79440 00:09:25.134 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@959 -- # uname 00:09:25.134 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:09:25.134 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 79440 00:09:25.134 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:09:25.134 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:09:25.134 killing process with pid 79440 00:09:25.134 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@972 -- # echo 'killing process with pid 79440' 00:09:25.134 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@973 -- # kill 79440 00:09:25.134 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@978 -- # wait 79440 00:09:25.394 RPC TIMEOUT SETTING TEST PASSED. 00:09:25.394 04:59:45 nvme_rpc_timeouts -- nvme/nvme_rpc_timeouts.sh@56 -- # echo RPC TIMEOUT SETTING TEST PASSED. 
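End to end, the passing checks above are a save/modify/save/compare cycle against spdk_tgt. A condensed sketch of the same flow, assuming a running spdk_tgt and the repo's rpc.py (the temp-file names are illustrative):

    rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    "$rpc" save_config > /tmp/settings_default
    "$rpc" bdev_nvme_set_options --timeout-us=12000000 --timeout-admin-us=24000000 --action-on-timeout=abort
    "$rpc" save_config > /tmp/settings_modified
    for s in action_on_timeout timeout_us timeout_admin_us; do
        # Same extraction pipeline as the trace: value column, punctuation stripped.
        before=$(grep "$s" /tmp/settings_default | awk '{print $2}' | sed 's/[^a-zA-Z0-9]//g')
        after=$(grep "$s" /tmp/settings_modified | awk '{print $2}' | sed 's/[^a-zA-Z0-9]//g')
        [[ "$before" != "$after" ]] && echo "Setting $s is changed as expected."
    done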
00:09:25.394 00:09:25.394 real 0m2.177s 00:09:25.394 user 0m4.444s 00:09:25.394 sys 0m0.419s 00:09:25.394 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@1130 -- # xtrace_disable 00:09:25.394 04:59:45 nvme_rpc_timeouts -- common/autotest_common.sh@10 -- # set +x 00:09:25.394 ************************************ 00:09:25.394 END TEST nvme_rpc_timeouts 00:09:25.394 ************************************ 00:09:25.394 04:59:45 -- spdk/autotest.sh@239 -- # uname -s 00:09:25.394 04:59:45 -- spdk/autotest.sh@239 -- # '[' Linux = Linux ']' 00:09:25.394 04:59:45 -- spdk/autotest.sh@240 -- # run_test sw_hotplug /home/vagrant/spdk_repo/spdk/test/nvme/sw_hotplug.sh 00:09:25.394 04:59:45 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:09:25.394 04:59:45 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:09:25.394 04:59:45 -- common/autotest_common.sh@10 -- # set +x 00:09:25.394 ************************************ 00:09:25.394 START TEST sw_hotplug 00:09:25.394 ************************************ 00:09:25.394 04:59:45 sw_hotplug -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/sw_hotplug.sh 00:09:25.394 * Looking for test storage... 00:09:25.394 * Found test storage at /home/vagrant/spdk_repo/spdk/test/nvme 00:09:25.394 04:59:45 sw_hotplug -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:09:25.394 04:59:45 sw_hotplug -- common/autotest_common.sh@1711 -- # lcov --version 00:09:25.394 04:59:45 sw_hotplug -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:09:25.394 04:59:45 sw_hotplug -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@333 -- # local ver1 ver1_l 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@334 -- # local ver2 ver2_l 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@336 -- # IFS=.-: 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@336 -- # read -ra ver1 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@337 -- # IFS=.-: 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@337 -- # read -ra ver2 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@338 -- # local 'op=<' 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@340 -- # ver1_l=2 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@341 -- # ver2_l=1 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:09:25.394 04:59:45 sw_hotplug -- scripts/common.sh@344 -- # case "$op" in 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@345 -- # : 1 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@364 -- # (( v = 0 )) 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@365 -- # decimal 1 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@353 -- # local d=1 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@355 -- # echo 1 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@365 -- # ver1[v]=1 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@366 -- # decimal 2 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@353 -- # local d=2 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@355 -- # echo 2 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@366 -- # ver2[v]=2 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:09:25.395 04:59:45 sw_hotplug -- scripts/common.sh@368 -- # return 0 00:09:25.395 04:59:45 sw_hotplug -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:09:25.395 04:59:45 sw_hotplug -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:09:25.395 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:25.395 --rc genhtml_branch_coverage=1 00:09:25.395 --rc genhtml_function_coverage=1 00:09:25.395 --rc genhtml_legend=1 00:09:25.395 --rc geninfo_all_blocks=1 00:09:25.395 --rc geninfo_unexecuted_blocks=1 00:09:25.395 00:09:25.395 ' 00:09:25.395 04:59:45 sw_hotplug -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:09:25.395 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:25.395 --rc genhtml_branch_coverage=1 00:09:25.395 --rc genhtml_function_coverage=1 00:09:25.395 --rc genhtml_legend=1 00:09:25.395 --rc geninfo_all_blocks=1 00:09:25.395 --rc geninfo_unexecuted_blocks=1 00:09:25.395 00:09:25.395 ' 00:09:25.395 04:59:45 sw_hotplug -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:09:25.395 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:25.395 --rc genhtml_branch_coverage=1 00:09:25.395 --rc genhtml_function_coverage=1 00:09:25.395 --rc genhtml_legend=1 00:09:25.395 --rc geninfo_all_blocks=1 00:09:25.395 --rc geninfo_unexecuted_blocks=1 00:09:25.395 00:09:25.395 ' 00:09:25.395 04:59:45 sw_hotplug -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:09:25.395 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:09:25.395 --rc genhtml_branch_coverage=1 00:09:25.395 --rc genhtml_function_coverage=1 00:09:25.395 --rc genhtml_legend=1 00:09:25.395 --rc geninfo_all_blocks=1 00:09:25.395 --rc geninfo_unexecuted_blocks=1 00:09:25.395 00:09:25.395 ' 00:09:25.395 04:59:45 sw_hotplug -- nvme/sw_hotplug.sh@129 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:09:25.967 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:09:25.967 0000:00:11.0 (1b36 0010): Already using the uio_pci_generic driver 00:09:25.967 0000:00:10.0 (1b36 0010): Already using the uio_pci_generic driver 00:09:25.967 0000:00:12.0 (1b36 0010): Already using the uio_pci_generic driver 00:09:25.967 0000:00:13.0 (1b36 0010): Already using the uio_pci_generic driver 00:09:25.967 04:59:46 sw_hotplug -- nvme/sw_hotplug.sh@131 -- # hotplug_wait=6 00:09:25.967 04:59:46 sw_hotplug -- nvme/sw_hotplug.sh@132 -- # hotplug_events=3 00:09:25.967 04:59:46 sw_hotplug -- nvme/sw_hotplug.sh@133 -- # nvmes=($(nvme_in_userspace)) 
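nvme_in_userspace, expanded in the trace that follows, is at heart a PCI class-code scan: NVMe controllers advertise class 01 (mass storage), subclass 08 (non-volatile memory controller), programming interface 02 (NVM Express), which is where the literal "0108" and the "-p02" match below come from. Condensed to one pipeline (a sketch; assumes lspci from pciutils):

    # Print the BDF of every device whose class/subclass is 0108 with prog-if 02.
    lspci -mm -n -D | grep -i -- -p02 | awk -v 'cc="0108"' -F ' ' '{if (cc ~ $2) print $1}' | tr -d '"'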
00:09:25.967 04:59:46 sw_hotplug -- nvme/sw_hotplug.sh@133 -- # nvme_in_userspace 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@312 -- # local bdf bdfs 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@313 -- # local nvmes 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@315 -- # [[ -n '' ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@318 -- # nvmes=($(iter_pci_class_code 01 08 02)) 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@318 -- # iter_pci_class_code 01 08 02 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@298 -- # local bdf= 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@300 -- # iter_all_pci_class_code 01 08 02 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@233 -- # local class 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@234 -- # local subclass 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@235 -- # local progif 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@236 -- # printf %02x 1 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@236 -- # class=01 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@237 -- # printf %02x 8 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@237 -- # subclass=08 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@238 -- # printf %02x 2 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@238 -- # progif=02 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@240 -- # hash lspci 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@241 -- # '[' 02 '!=' 00 ']' 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@242 -- # lspci -mm -n -D 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@243 -- # grep -i -- -p02 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@244 -- # awk -v 'cc="0108"' -F ' ' '{if (cc ~ $2) print $1}' 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@245 -- # tr -d '"' 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@300 -- # for bdf in $(iter_all_pci_class_code "$@") 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@301 -- # pci_can_use 0000:00:10.0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@18 -- # local i 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@21 -- # [[ =~ 0000:00:10.0 ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@27 -- # return 0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@302 -- # echo 0000:00:10.0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@300 -- # for bdf in $(iter_all_pci_class_code "$@") 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@301 -- # pci_can_use 0000:00:11.0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@18 -- # local i 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@21 -- # [[ =~ 0000:00:11.0 ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@27 -- # return 0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@302 -- # echo 0000:00:11.0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@300 -- # for bdf in $(iter_all_pci_class_code "$@") 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@301 -- # pci_can_use 0000:00:12.0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@18 -- # local i 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@21 -- # [[ =~ 0000:00:12.0 ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@27 -- # return 0 00:09:25.967 04:59:46 sw_hotplug -- 
scripts/common.sh@302 -- # echo 0000:00:12.0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@300 -- # for bdf in $(iter_all_pci_class_code "$@") 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@301 -- # pci_can_use 0000:00:13.0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@18 -- # local i 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@21 -- # [[ =~ 0000:00:13.0 ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@25 -- # [[ -z '' ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@27 -- # return 0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@302 -- # echo 0000:00:13.0 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@321 -- # for bdf in "${nvmes[@]}" 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@322 -- # [[ -e /sys/bus/pci/drivers/nvme/0000:00:10.0 ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@323 -- # uname -s 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@323 -- # [[ Linux == FreeBSD ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@326 -- # bdfs+=("$bdf") 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@321 -- # for bdf in "${nvmes[@]}" 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@322 -- # [[ -e /sys/bus/pci/drivers/nvme/0000:00:11.0 ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@323 -- # uname -s 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@323 -- # [[ Linux == FreeBSD ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@326 -- # bdfs+=("$bdf") 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@321 -- # for bdf in "${nvmes[@]}" 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@322 -- # [[ -e /sys/bus/pci/drivers/nvme/0000:00:12.0 ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@323 -- # uname -s 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@323 -- # [[ Linux == FreeBSD ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@326 -- # bdfs+=("$bdf") 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@321 -- # for bdf in "${nvmes[@]}" 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@322 -- # [[ -e /sys/bus/pci/drivers/nvme/0000:00:13.0 ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@323 -- # uname -s 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@323 -- # [[ Linux == FreeBSD ]] 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@326 -- # bdfs+=("$bdf") 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@328 -- # (( 4 )) 00:09:25.967 04:59:46 sw_hotplug -- scripts/common.sh@329 -- # printf '%s\n' 0000:00:10.0 0000:00:11.0 0000:00:12.0 0000:00:13.0 00:09:25.967 04:59:46 sw_hotplug -- nvme/sw_hotplug.sh@134 -- # nvme_count=2 00:09:25.967 04:59:46 sw_hotplug -- nvme/sw_hotplug.sh@135 -- # nvmes=("${nvmes[@]::nvme_count}") 00:09:25.967 04:59:46 sw_hotplug -- nvme/sw_hotplug.sh@138 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:09:26.538 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:09:26.538 Waiting for block devices as requested 00:09:26.538 0000:00:11.0 (1b36 0010): uio_pci_generic -> nvme 00:09:26.538 0000:00:10.0 (1b36 0010): uio_pci_generic -> nvme 00:09:26.799 0000:00:12.0 (1b36 0010): uio_pci_generic -> nvme 00:09:26.800 0000:00:13.0 (1b36 0010): uio_pci_generic -> nvme 00:09:32.118 * Events for some block/disk devices (0000:00:13.0) were not caught, they may be missing 00:09:32.118 04:59:51 sw_hotplug -- nvme/sw_hotplug.sh@140 -- # PCI_ALLOWED='0000:00:10.0 0000:00:11.0' 00:09:32.118 04:59:51 sw_hotplug -- nvme/sw_hotplug.sh@140 -- # 
/home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:09:32.376 0000:00:03.0 (1af4 1001): Skipping denied controller at 0000:00:03.0 00:09:32.376 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:09:32.376 0000:00:12.0 (1b36 0010): Skipping denied controller at 0000:00:12.0 00:09:32.634 0000:00:13.0 (1b36 0010): Skipping denied controller at 0000:00:13.0 00:09:32.892 0000:00:11.0 (1b36 0010): nvme -> uio_pci_generic 00:09:32.892 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:09:32.892 04:59:52 sw_hotplug -- nvme/sw_hotplug.sh@143 -- # xtrace_disable 00:09:32.892 04:59:52 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:09:32.892 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@148 -- # run_hotplug 00:09:32.892 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@77 -- # trap 'killprocess $hotplug_pid; exit 1' SIGINT SIGTERM EXIT 00:09:32.892 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@85 -- # hotplug_pid=80284 00:09:32.892 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@87 -- # debug_remove_attach_helper 3 6 false 00:09:32.892 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@19 -- # local helper_time=0 00:09:33.150 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@80 -- # /home/vagrant/spdk_repo/spdk/build/examples/hotplug -i 0 -t 0 -n 6 -r 6 -l warning 00:09:33.150 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@21 -- # timing_cmd remove_attach_helper 3 6 false 00:09:33.150 04:59:53 sw_hotplug -- common/autotest_common.sh@709 -- # local cmd_es=0 00:09:33.150 04:59:53 sw_hotplug -- common/autotest_common.sh@711 -- # [[ -t 0 ]] 00:09:33.150 04:59:53 sw_hotplug -- common/autotest_common.sh@711 -- # exec 00:09:33.150 04:59:53 sw_hotplug -- common/autotest_common.sh@713 -- # local time=0 TIMEFORMAT=%2R 00:09:33.150 04:59:53 sw_hotplug -- common/autotest_common.sh@719 -- # remove_attach_helper 3 6 false 00:09:33.150 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@27 -- # local hotplug_events=3 00:09:33.150 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@28 -- # local hotplug_wait=6 00:09:33.150 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@29 -- # local use_bdev=false 00:09:33.150 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@30 -- # local dev bdfs 00:09:33.150 04:59:53 sw_hotplug -- nvme/sw_hotplug.sh@36 -- # sleep 6 00:09:33.150 Initializing NVMe Controllers 00:09:33.150 Attaching to 0000:00:10.0 00:09:33.150 Attaching to 0000:00:11.0 00:09:33.150 Attached to 0000:00:11.0 00:09:33.150 Attached to 0000:00:10.0 00:09:33.150 Initialization complete. Starting I/O... 
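The hotplug example binary above drives attach/detach from inside the process; the bare "echo 1" and "echo uio_pci_generic" statements traced further below are the sysfs half of the software hotplug (bash xtrace does not show redirection targets). A rough sketch of the standard Linux PCI nodes such writes go to, with an illustrative BDF:

    bdf=0000:00:10.0
    echo 1 > /sys/bus/pci/devices/$bdf/remove            # hot-remove the function
    echo 1 > /sys/bus/pci/rescan                         # rediscover it on the bus
    echo uio_pci_generic > /sys/bus/pci/devices/$bdf/driver_override
    echo "$bdf" > /sys/bus/pci/drivers_probe             # rebind to the chosen driver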
00:09:33.150 QEMU NVMe Ctrl (12341 ): 0 I/Os completed (+0) 00:09:33.150 QEMU NVMe Ctrl (12340 ): 0 I/Os completed (+0) 00:09:33.150 00:09:34.522 QEMU NVMe Ctrl (12341 ): 3004 I/Os completed (+3004) 00:09:34.522 QEMU NVMe Ctrl (12340 ): 3008 I/Os completed (+3008) 00:09:34.522 00:09:35.455 QEMU NVMe Ctrl (12341 ): 7081 I/Os completed (+4077) 00:09:35.455 QEMU NVMe Ctrl (12340 ): 6974 I/Os completed (+3966) 00:09:35.455 00:09:36.389 QEMU NVMe Ctrl (12341 ): 10801 I/Os completed (+3720) 00:09:36.389 QEMU NVMe Ctrl (12340 ): 10623 I/Os completed (+3649) 00:09:36.389 00:09:37.353 QEMU NVMe Ctrl (12341 ): 15116 I/Os completed (+4315) 00:09:37.353 QEMU NVMe Ctrl (12340 ): 14908 I/Os completed (+4285) 00:09:37.353 00:09:38.299 QEMU NVMe Ctrl (12341 ): 18229 I/Os completed (+3113) 00:09:38.299 QEMU NVMe Ctrl (12340 ): 18059 I/Os completed (+3151) 00:09:38.299 00:09:39.242 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:09:39.243 [2024-12-15 04:59:59.041104] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:10.0, 0] in failed state. 00:09:39.243 Controller removed: QEMU NVMe Ctrl (12340 ) 00:09:39.243 [2024-12-15 04:59:59.042575] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.042654] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.042672] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.042693] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 unregister_dev: QEMU NVMe Ctrl (12340 ) 00:09:39.243 [2024-12-15 04:59:59.044124] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.044185] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.044201] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.044217] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:09:39.243 [2024-12-15 04:59:59.068985] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:11.0, 0] in failed state. 
00:09:39.243 Controller removed: QEMU NVMe Ctrl (12341 ) 00:09:39.243 [2024-12-15 04:59:59.070088] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.070143] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.070163] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.070178] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 unregister_dev: QEMU NVMe Ctrl (12341 ) 00:09:39.243 [2024-12-15 04:59:59.071455] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.071500] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.071520] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 [2024-12-15 04:59:59.071534] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:39.243 EAL: eal_parse_sysfs_value(): cannot open sysfs value /sys/bus/pci/devices/0000:00:11.0/vendor 00:09:39.243 EAL: Scan for (pci) bus failed. 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@43 -- # false 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@56 -- # echo 1 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:10.0 00:09:39.243 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:10.0 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:11.0 00:09:39.243 Attaching to 0000:00:10.0 00:09:39.243 Attached to 0000:00:10.0 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:11.0 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:09:39.243 04:59:59 sw_hotplug -- nvme/sw_hotplug.sh@66 -- # sleep 12 00:09:39.243 Attaching to 0000:00:11.0 00:09:39.243 Attached to 0000:00:11.0 00:09:40.183 QEMU NVMe Ctrl (12340 ): 3746 I/Os completed (+3746) 00:09:40.183 QEMU NVMe Ctrl (12341 ): 3512 I/Os completed (+3512) 00:09:40.183 00:09:41.125 QEMU NVMe Ctrl (12340 ): 7430 I/Os completed (+3684) 00:09:41.125 QEMU NVMe Ctrl (12341 ): 7170 I/Os completed (+3658) 00:09:41.125 00:09:42.510 QEMU NVMe Ctrl (12340 ): 10546 I/Os completed (+3116) 00:09:42.510 QEMU NVMe Ctrl (12341 ): 10288 I/Os completed (+3118) 00:09:42.510 00:09:43.453 QEMU NVMe Ctrl (12340 ): 13653 I/Os completed (+3107) 00:09:43.453 QEMU NVMe Ctrl (12341 ): 13400 I/Os completed (+3112) 00:09:43.453 00:09:44.394 QEMU NVMe Ctrl (12340 ): 17585 I/Os completed (+3932) 00:09:44.394 QEMU NVMe Ctrl (12341 ): 17335 I/Os completed (+3935) 00:09:44.394 00:09:45.336 QEMU NVMe Ctrl (12340 ): 21092 I/Os completed (+3507) 00:09:45.336 QEMU NVMe Ctrl (12341 ): 20945 I/Os completed (+3610) 00:09:45.336 00:09:46.277 QEMU NVMe Ctrl (12340 ): 24666 I/Os completed (+3574) 00:09:46.277 QEMU NVMe Ctrl (12341 ): 24653 I/Os completed (+3708) 
00:09:46.277 00:09:47.220 QEMU NVMe Ctrl (12340 ): 28079 I/Os completed (+3413) 00:09:47.220 QEMU NVMe Ctrl (12341 ): 28106 I/Os completed (+3453) 00:09:47.220 00:09:48.163 QEMU NVMe Ctrl (12340 ): 31247 I/Os completed (+3168) 00:09:48.163 QEMU NVMe Ctrl (12341 ): 31274 I/Os completed (+3168) 00:09:48.163 00:09:49.105 QEMU NVMe Ctrl (12340 ): 34811 I/Os completed (+3564) 00:09:49.105 QEMU NVMe Ctrl (12341 ): 34838 I/Os completed (+3564) 00:09:49.105 00:09:50.499 QEMU NVMe Ctrl (12340 ): 37731 I/Os completed (+2920) 00:09:50.499 QEMU NVMe Ctrl (12341 ): 37762 I/Os completed (+2924) 00:09:50.499 00:09:51.442 QEMU NVMe Ctrl (12340 ): 40783 I/Os completed (+3052) 00:09:51.442 QEMU NVMe Ctrl (12341 ): 40816 I/Os completed (+3054) 00:09:51.442 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@68 -- # false 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:09:51.442 [2024-12-15 05:00:11.345557] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:10.0, 0] in failed state. 00:09:51.442 Controller removed: QEMU NVMe Ctrl (12340 ) 00:09:51.442 [2024-12-15 05:00:11.346767] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.346818] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.346836] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.346855] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 unregister_dev: QEMU NVMe Ctrl (12340 ) 00:09:51.442 [2024-12-15 05:00:11.348200] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.348256] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.348270] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.348286] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:09:51.442 [2024-12-15 05:00:11.371022] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:11.0, 0] in failed state. 
00:09:51.442 Controller removed: QEMU NVMe Ctrl (12341 ) 00:09:51.442 [2024-12-15 05:00:11.372162] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.372215] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.372235] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.372250] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 unregister_dev: QEMU NVMe Ctrl (12341 ) 00:09:51.442 [2024-12-15 05:00:11.373511] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.373555] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.373573] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 [2024-12-15 05:00:11.373585] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@43 -- # false 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@56 -- # echo 1 00:09:51.442 EAL: eal_parse_sysfs_value(): cannot open sysfs value /sys/bus/pci/devices/0000:00:11.0/vendor 00:09:51.442 EAL: Scan for (pci) bus failed. 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:10.0 00:09:51.442 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:10.0 00:09:51.703 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:09:51.703 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:09:51.703 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:09:51.703 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:11.0 00:09:51.703 Attaching to 0000:00:10.0 00:09:51.703 Attached to 0000:00:10.0 00:09:51.703 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:11.0 00:09:51.703 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:09:51.703 05:00:11 sw_hotplug -- nvme/sw_hotplug.sh@66 -- # sleep 12 00:09:51.703 Attaching to 0000:00:11.0 00:09:51.703 Attached to 0000:00:11.0 00:09:52.277 QEMU NVMe Ctrl (12340 ): 1867 I/Os completed (+1867) 00:09:52.277 QEMU NVMe Ctrl (12341 ): 1642 I/Os completed (+1642) 00:09:52.277 00:09:53.228 QEMU NVMe Ctrl (12340 ): 4915 I/Os completed (+3048) 00:09:53.228 QEMU NVMe Ctrl (12341 ): 4690 I/Os completed (+3048) 00:09:53.228 00:09:54.172 QEMU NVMe Ctrl (12340 ): 7883 I/Os completed (+2968) 00:09:54.172 QEMU NVMe Ctrl (12341 ): 7666 I/Os completed (+2976) 00:09:54.172 00:09:55.116 QEMU NVMe Ctrl (12340 ): 10927 I/Os completed (+3044) 00:09:55.116 QEMU NVMe Ctrl (12341 ): 10710 I/Os completed (+3044) 00:09:55.116 00:09:56.523 QEMU NVMe Ctrl (12340 ): 13940 I/Os completed (+3013) 00:09:56.523 QEMU NVMe Ctrl (12341 ): 13732 I/Os completed (+3022) 00:09:56.523 00:09:57.095 QEMU NVMe Ctrl (12340 ): 16841 I/Os completed (+2901) 00:09:57.095 QEMU NVMe Ctrl (12341 ): 16683 I/Os completed (+2951) 00:09:57.095 00:09:58.480 QEMU NVMe Ctrl (12340 ): 19853 I/Os completed (+3012) 00:09:58.480 QEMU NVMe Ctrl (12341 ): 19691 I/Os completed (+3008) 00:09:58.480 
00:09:59.423 QEMU NVMe Ctrl (12340 ): 22805 I/Os completed (+2952) 00:09:59.423 QEMU NVMe Ctrl (12341 ): 22645 I/Os completed (+2954) 00:09:59.423 00:10:00.367 QEMU NVMe Ctrl (12340 ): 25502 I/Os completed (+2697) 00:10:00.367 QEMU NVMe Ctrl (12341 ): 25422 I/Os completed (+2777) 00:10:00.367 00:10:01.308 QEMU NVMe Ctrl (12340 ): 29336 I/Os completed (+3834) 00:10:01.308 QEMU NVMe Ctrl (12341 ): 29330 I/Os completed (+3908) 00:10:01.308 00:10:02.251 QEMU NVMe Ctrl (12340 ): 32643 I/Os completed (+3307) 00:10:02.251 QEMU NVMe Ctrl (12341 ): 32820 I/Os completed (+3490) 00:10:02.251 00:10:03.195 QEMU NVMe Ctrl (12340 ): 35334 I/Os completed (+2691) 00:10:03.195 QEMU NVMe Ctrl (12341 ): 35517 I/Os completed (+2697) 00:10:03.195 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@68 -- # false 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:10:03.768 [2024-12-15 05:00:23.667388] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:10.0, 0] in failed state. 00:10:03.768 Controller removed: QEMU NVMe Ctrl (12340 ) 00:10:03.768 [2024-12-15 05:00:23.668778] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.668852] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.668870] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.668898] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 unregister_dev: QEMU NVMe Ctrl (12340 ) 00:10:03.768 [2024-12-15 05:00:23.670600] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.670675] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.670692] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.670711] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:10:03.768 [2024-12-15 05:00:23.689218] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:11.0, 0] in failed state. 
00:10:03.768 Controller removed: QEMU NVMe Ctrl (12341 ) 00:10:03.768 [2024-12-15 05:00:23.690326] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.690373] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.690393] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.690409] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 unregister_dev: QEMU NVMe Ctrl (12341 ) 00:10:03.768 [2024-12-15 05:00:23.691737] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.691782] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.691800] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 [2024-12-15 05:00:23.691814] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@43 -- # false 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@56 -- # echo 1 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:10.0 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:10.0 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:10:03.768 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:11.0 00:10:03.768 Attaching to 0000:00:10.0 00:10:04.030 Attached to 0000:00:10.0 00:10:04.030 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:11.0 00:10:04.030 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:10:04.030 05:00:23 sw_hotplug -- nvme/sw_hotplug.sh@66 -- # sleep 12 00:10:04.030 Attaching to 0000:00:11.0 00:10:04.030 Attached to 0000:00:11.0 00:10:04.030 unregister_dev: QEMU NVMe Ctrl (12340 ) 00:10:04.030 unregister_dev: QEMU NVMe Ctrl (12341 ) 00:10:04.030 [2024-12-15 05:00:23.984038] rpc.c: 409:spdk_rpc_close: *WARNING*: spdk_rpc_close: deprecated feature spdk_rpc_close is deprecated to be removed in v24.09 00:10:16.266 05:00:35 sw_hotplug -- nvme/sw_hotplug.sh@68 -- # false 00:10:16.266 05:00:35 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:10:16.266 05:00:35 sw_hotplug -- common/autotest_common.sh@719 -- # time=42.94 00:10:16.266 05:00:35 sw_hotplug -- common/autotest_common.sh@720 -- # echo 42.94 00:10:16.266 05:00:35 sw_hotplug -- common/autotest_common.sh@722 -- # return 0 00:10:16.266 05:00:35 sw_hotplug -- nvme/sw_hotplug.sh@21 -- # helper_time=42.94 00:10:16.266 05:00:35 sw_hotplug -- nvme/sw_hotplug.sh@22 -- # printf 'remove_attach_helper took %ss to complete (handling %u nvme drive(s))' 42.94 2 00:10:16.266 remove_attach_helper took 42.94s to complete (handling 2 nvme drive(s)) 05:00:35 sw_hotplug -- nvme/sw_hotplug.sh@91 -- # sleep 6 00:10:22.843 05:00:41 sw_hotplug -- nvme/sw_hotplug.sh@93 -- # kill -0 80284 00:10:22.843 
/home/vagrant/spdk_repo/spdk/test/nvme/sw_hotplug.sh: line 93: kill: (80284) - No such process 00:10:22.843 05:00:41 sw_hotplug -- nvme/sw_hotplug.sh@95 -- # wait 80284 00:10:22.843 05:00:41 sw_hotplug -- nvme/sw_hotplug.sh@102 -- # trap - SIGINT SIGTERM EXIT 00:10:22.843 05:00:41 sw_hotplug -- nvme/sw_hotplug.sh@151 -- # tgt_run_hotplug 00:10:22.843 05:00:41 sw_hotplug -- nvme/sw_hotplug.sh@107 -- # local dev 00:10:22.843 05:00:41 sw_hotplug -- nvme/sw_hotplug.sh@110 -- # spdk_tgt_pid=80834 00:10:22.843 05:00:41 sw_hotplug -- nvme/sw_hotplug.sh@112 -- # trap 'killprocess ${spdk_tgt_pid}; echo 1 > /sys/bus/pci/rescan; exit 1' SIGINT SIGTERM EXIT 00:10:22.843 05:00:41 sw_hotplug -- nvme/sw_hotplug.sh@113 -- # waitforlisten 80834 00:10:22.843 05:00:41 sw_hotplug -- nvme/sw_hotplug.sh@109 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:10:22.843 05:00:41 sw_hotplug -- common/autotest_common.sh@835 -- # '[' -z 80834 ']' 00:10:22.843 05:00:41 sw_hotplug -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:22.843 05:00:41 sw_hotplug -- common/autotest_common.sh@840 -- # local max_retries=100 00:10:22.843 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:22.843 05:00:41 sw_hotplug -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:22.843 05:00:41 sw_hotplug -- common/autotest_common.sh@844 -- # xtrace_disable 00:10:22.843 05:00:41 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:22.843 [2024-12-15 05:00:42.075753] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:10:22.843 [2024-12-15 05:00:42.075903] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid80834 ] 00:10:22.843 [2024-12-15 05:00:42.237643] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:22.843 [2024-12-15 05:00:42.266185] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:10:22.843 05:00:42 sw_hotplug -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:10:22.843 05:00:42 sw_hotplug -- common/autotest_common.sh@868 -- # return 0 00:10:22.843 05:00:42 sw_hotplug -- nvme/sw_hotplug.sh@115 -- # rpc_cmd bdev_nvme_set_hotplug -e 00:10:22.843 05:00:42 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:10:22.843 05:00:42 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:22.843 05:00:42 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:10:22.843 05:00:42 sw_hotplug -- nvme/sw_hotplug.sh@117 -- # debug_remove_attach_helper 3 6 true 00:10:22.843 05:00:42 sw_hotplug -- nvme/sw_hotplug.sh@19 -- # local helper_time=0 00:10:22.843 05:00:42 sw_hotplug -- nvme/sw_hotplug.sh@21 -- # timing_cmd remove_attach_helper 3 6 true 00:10:22.843 05:00:42 sw_hotplug -- common/autotest_common.sh@709 -- # local cmd_es=0 00:10:22.843 05:00:42 sw_hotplug -- common/autotest_common.sh@711 -- # [[ -t 0 ]] 00:10:22.843 05:00:42 sw_hotplug -- common/autotest_common.sh@711 -- # exec 00:10:22.843 05:00:42 sw_hotplug -- common/autotest_common.sh@713 -- # local time=0 TIMEFORMAT=%2R 00:10:22.843 05:00:42 sw_hotplug -- common/autotest_common.sh@719 -- # remove_attach_helper 3 6 true 00:10:22.843 05:00:42 sw_hotplug -- nvme/sw_hotplug.sh@27 -- # local hotplug_events=3 00:10:22.843 05:00:42 
sw_hotplug -- nvme/sw_hotplug.sh@28 -- # local hotplug_wait=6 00:10:22.843 05:00:42 sw_hotplug -- nvme/sw_hotplug.sh@29 -- # local use_bdev=true 00:10:22.843 05:00:42 sw_hotplug -- nvme/sw_hotplug.sh@30 -- # local dev bdfs 00:10:22.843 05:00:42 sw_hotplug -- nvme/sw_hotplug.sh@36 -- # sleep 6 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@43 -- # true 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:10:29.432 05:00:48 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:10:29.432 05:00:48 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:29.432 05:00:48 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:10:29.432 05:00:49 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:10:29.432 05:00:49 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 2 > 0 )) 00:10:29.432 05:00:49 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # sleep 0.5 00:10:29.432 [2024-12-15 05:00:49.032647] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:10.0, 0] in failed state. 00:10:29.432 [2024-12-15 05:00:49.033722] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:29.432 [2024-12-15 05:00:49.033755] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:10:29.432 [2024-12-15 05:00:49.033770] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:29.432 [2024-12-15 05:00:49.033783] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:29.432 [2024-12-15 05:00:49.033792] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:10:29.432 [2024-12-15 05:00:49.033799] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:29.432 [2024-12-15 05:00:49.033808] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:29.432 [2024-12-15 05:00:49.033814] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:10:29.432 [2024-12-15 05:00:49.033822] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:29.432 [2024-12-15 05:00:49.033829] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:29.432 [2024-12-15 05:00:49.033836] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:10:29.432 [2024-12-15 05:00:49.033842] nvme_qpair.c: 
474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:29.432 05:00:49 sw_hotplug -- nvme/sw_hotplug.sh@51 -- # printf 'Still waiting for %s to be gone\n' 0000:00:10.0 0000:00:11.0 00:10:29.432 05:00:49 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:10:29.432 05:00:49 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:10:29.432 05:00:49 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:10:29.432 05:00:49 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:10:29.432 05:00:49 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:10:29.432 05:00:49 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:29.432 05:00:49 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:10:29.432 [2024-12-15 05:00:49.532661] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:11.0, 0] in failed state. 00:10:29.432 05:00:49 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:10:29.432 [2024-12-15 05:00:49.534178] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:29.432 [2024-12-15 05:00:49.534214] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:10:29.432 [2024-12-15 05:00:49.534225] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:29.432 [2024-12-15 05:00:49.534238] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:29.432 [2024-12-15 05:00:49.534245] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:10:29.432 [2024-12-15 05:00:49.534253] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:29.432 [2024-12-15 05:00:49.534260] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:29.432 [2024-12-15 05:00:49.534267] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:10:29.432 [2024-12-15 05:00:49.534274] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:29.432 [2024-12-15 05:00:49.534284] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:29.432 [2024-12-15 05:00:49.534290] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:10:29.432 [2024-12-15 05:00:49.534297] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:29.432 05:00:49 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 1 > 0 )) 00:10:29.432 05:00:49 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # sleep 0.5 00:10:30.004 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@51 -- # printf 'Still waiting for %s to be gone\n' 0000:00:11.0 00:10:30.004 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:10:30.004 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:10:30.004 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:10:30.004 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # 
rpc_cmd bdev_get_bdevs 00:10:30.004 05:00:50 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:10:30.004 05:00:50 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:30.004 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:10:30.004 05:00:50 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:10:30.004 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 0 > 0 )) 00:10:30.004 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@56 -- # echo 1 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:10.0 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:10.0 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:11.0 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:11.0 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:10:30.265 05:00:50 sw_hotplug -- nvme/sw_hotplug.sh@66 -- # sleep 12 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@68 -- # true 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdfs=($(bdev_bdfs)) 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdev_bdfs 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:10:42.502 05:01:02 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:10:42.502 05:01:02 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:42.502 05:01:02 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@71 -- # [[ 0000:00:10.0 0000:00:11.0 == \0\0\0\0\:\0\0\:\1\0\.\0\ \0\0\0\0\:\0\0\:\1\1\.\0 ]] 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@43 -- # true 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:10:42.502 05:01:02 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:10:42.502 05:01:02 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:10:42.502 05:01:02 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:10:42.502 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 2 > 0 )) 00:10:42.502 
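The polling traced above is how the bdev-based variant (use_bdev=true) confirms that a surprise-removed controller is really gone: bdev_get_bdevs is issued over RPC, each NVMe bdev's PCI address is extracted with the jq filter shown, and the sleep 0.5 that follows repeats the check until the sorted BDF list is empty. A standalone sketch of the same wait loop, assuming scripts/rpc.py and jq are on PATH, the target listens on the default /var/tmp/spdk.sock, and every bdev is NVMe-backed (the in-tree helper uses rpc_cmd and a bdev_bdfs function instead):

# Poll the SPDK target until no bdev reports a PCI address any more.
while :; do
    mapfile -t bdfs < <(scripts/rpc.py bdev_get_bdevs \
        | jq -r '.[].driver_specific.nvme[].pci_address' | sort -u)
    (( ${#bdfs[@]} > 0 )) || break                 # all controllers detached
    printf 'Still waiting for %s to be gone\n' "${bdfs[@]}"
    sleep 0.5
done

The echo 1 at @40 earlier in each cycle and the @56-@62 echoes after the loop appear to be the two sysfs halves of the cycle, consistent with writing 1 to each device's remove node, requesting a bus rescan, and then repeating the driver_override rebind shown earlier so the controllers return as uio_pci_generic devices.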
05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # sleep 0.5 00:10:42.502 [2024-12-15 05:01:02.432819] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:10.0, 0] in failed state. 00:10:42.502 [2024-12-15 05:01:02.433860] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:42.502 [2024-12-15 05:01:02.433893] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:10:42.502 [2024-12-15 05:01:02.433906] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:42.502 [2024-12-15 05:01:02.433919] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:42.502 [2024-12-15 05:01:02.433929] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:10:42.502 [2024-12-15 05:01:02.433936] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:42.502 [2024-12-15 05:01:02.433944] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:42.502 [2024-12-15 05:01:02.433964] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:10:42.502 [2024-12-15 05:01:02.433972] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:42.502 [2024-12-15 05:01:02.433978] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:42.502 [2024-12-15 05:01:02.433986] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:10:42.502 [2024-12-15 05:01:02.433992] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:42.767 [2024-12-15 05:01:02.832823] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:11.0, 0] in failed state. 
00:10:42.767 [2024-12-15 05:01:02.833853] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:42.767 [2024-12-15 05:01:02.833883] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:10:42.767 [2024-12-15 05:01:02.833894] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:42.767 [2024-12-15 05:01:02.833905] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:42.767 [2024-12-15 05:01:02.833912] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:10:42.767 [2024-12-15 05:01:02.833920] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:42.767 [2024-12-15 05:01:02.833927] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:42.767 [2024-12-15 05:01:02.833934] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:10:42.767 [2024-12-15 05:01:02.833940] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:42.767 [2024-12-15 05:01:02.833948] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:42.767 [2024-12-15 05:01:02.833955] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:10:42.767 [2024-12-15 05:01:02.833962] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:43.027 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@51 -- # printf 'Still waiting for %s to be gone\n' 0000:00:10.0 0000:00:11.0 00:10:43.027 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:10:43.027 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:10:43.027 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:10:43.027 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:10:43.027 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:10:43.027 05:01:02 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:10:43.027 05:01:02 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:43.027 05:01:02 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:10:43.027 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 0 > 0 )) 00:10:43.027 05:01:02 sw_hotplug -- nvme/sw_hotplug.sh@56 -- # echo 1 00:10:43.027 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:10:43.027 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:10:43.027 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:10.0 00:10:43.027 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:10.0 00:10:43.027 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:10:43.027 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:10:43.027 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:10:43.027 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 
0000:00:11.0 00:10:43.288 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:11.0 00:10:43.288 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:10:43.288 05:01:03 sw_hotplug -- nvme/sw_hotplug.sh@66 -- # sleep 12 00:10:55.524 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@68 -- # true 00:10:55.524 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdfs=($(bdev_bdfs)) 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdev_bdfs 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:10:55.525 05:01:15 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:10:55.525 05:01:15 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:55.525 05:01:15 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@71 -- # [[ 0000:00:10.0 0000:00:11.0 == \0\0\0\0\:\0\0\:\1\0\.\0\ \0\0\0\0\:\0\0\:\1\1\.\0 ]] 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@43 -- # true 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:10:55.525 05:01:15 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:10:55.525 05:01:15 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:55.525 05:01:15 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 2 > 0 )) 00:10:55.525 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # sleep 0.5 00:10:55.525 [2024-12-15 05:01:15.333009] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:10.0, 0] in failed state. 
00:10:55.525 [2024-12-15 05:01:15.334017] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:55.525 [2024-12-15 05:01:15.334042] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:10:55.525 [2024-12-15 05:01:15.334056] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:55.525 [2024-12-15 05:01:15.334067] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:55.525 [2024-12-15 05:01:15.334075] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:10:55.525 [2024-12-15 05:01:15.334082] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:55.525 [2024-12-15 05:01:15.334090] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:55.525 [2024-12-15 05:01:15.334096] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:10:55.525 [2024-12-15 05:01:15.334104] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:55.525 [2024-12-15 05:01:15.334111] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:55.525 [2024-12-15 05:01:15.334118] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:10:55.525 [2024-12-15 05:01:15.334125] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:55.786 [2024-12-15 05:01:15.733011] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:11.0, 0] in failed state. 
00:10:55.786 [2024-12-15 05:01:15.733976] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:55.786 [2024-12-15 05:01:15.734002] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:10:55.786 [2024-12-15 05:01:15.734011] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:55.786 [2024-12-15 05:01:15.734021] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:55.786 [2024-12-15 05:01:15.734028] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:10:55.786 [2024-12-15 05:01:15.734037] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:55.786 [2024-12-15 05:01:15.734044] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:55.786 [2024-12-15 05:01:15.734052] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:10:55.786 [2024-12-15 05:01:15.734058] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:55.786 [2024-12-15 05:01:15.734066] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:10:55.786 [2024-12-15 05:01:15.734072] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:10:55.786 [2024-12-15 05:01:15.734079] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:10:55.786 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@51 -- # printf 'Still waiting for %s to be gone\n' 0000:00:10.0 0000:00:11.0 00:10:55.786 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:10:55.786 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:10:55.786 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:10:55.786 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:10:55.786 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:10:55.786 05:01:15 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:10:55.786 05:01:15 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:10:55.786 05:01:15 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:10:55.786 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 0 > 0 )) 00:10:55.786 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@56 -- # echo 1 00:10:55.786 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:10:55.787 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:10:55.787 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:10.0 00:10:56.048 05:01:15 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:10.0 00:10:56.048 05:01:16 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:10:56.048 05:01:16 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:10:56.048 05:01:16 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:10:56.048 05:01:16 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 
0000:00:11.0 00:10:56.048 05:01:16 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:11.0 00:10:56.048 05:01:16 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:10:56.048 05:01:16 sw_hotplug -- nvme/sw_hotplug.sh@66 -- # sleep 12 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@68 -- # true 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdfs=($(bdev_bdfs)) 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdev_bdfs 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@71 -- # [[ 0000:00:10.0 0000:00:11.0 == \0\0\0\0\:\0\0\:\1\0\.\0\ \0\0\0\0\:\0\0\:\1\1\.\0 ]] 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@719 -- # time=45.19 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@720 -- # echo 45.19 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@722 -- # return 0 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@21 -- # helper_time=45.19 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@22 -- # printf 'remove_attach_helper took %ss to complete (handling %u nvme drive(s))' 45.19 2 00:11:08.283 remove_attach_helper took 45.19s to complete (handling 2 nvme drive(s)) 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@119 -- # rpc_cmd bdev_nvme_set_hotplug -d 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@120 -- # rpc_cmd bdev_nvme_set_hotplug -e 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@122 -- # debug_remove_attach_helper 3 6 true 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@19 -- # local helper_time=0 00:11:08.283 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@21 -- # timing_cmd remove_attach_helper 3 6 true 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@709 -- # local cmd_es=0 00:11:08.283 05:01:28 sw_hotplug -- common/autotest_common.sh@711 -- # [[ -t 0 ]] 00:11:08.284 05:01:28 sw_hotplug -- common/autotest_common.sh@711 -- # exec 00:11:08.284 05:01:28 sw_hotplug -- common/autotest_common.sh@713 -- # local time=0 TIMEFORMAT=%2R 00:11:08.284 05:01:28 sw_hotplug -- common/autotest_common.sh@719 -- # remove_attach_helper 3 6 true 00:11:08.284 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@27 -- # local hotplug_events=3 00:11:08.284 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@28 -- # local hotplug_wait=6 00:11:08.284 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@29 -- # local use_bdev=true 00:11:08.284 05:01:28 sw_hotplug -- nvme/sw_hotplug.sh@30 -- # local dev bdfs 00:11:08.284 05:01:28 sw_hotplug -- 
nvme/sw_hotplug.sh@36 -- # sleep 6 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@43 -- # true 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:11:14.871 05:01:34 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:14.871 05:01:34 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:14.871 05:01:34 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 2 > 0 )) 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # sleep 0.5 00:11:14.871 [2024-12-15 05:01:34.246313] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:10.0, 0] in failed state. 00:11:14.871 [2024-12-15 05:01:34.247276] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:14.871 [2024-12-15 05:01:34.247320] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:11:14.871 [2024-12-15 05:01:34.247332] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:14.871 [2024-12-15 05:01:34.247343] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:14.871 [2024-12-15 05:01:34.247352] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:11:14.871 [2024-12-15 05:01:34.247359] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:14.871 [2024-12-15 05:01:34.247367] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:14.871 [2024-12-15 05:01:34.247374] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:11:14.871 [2024-12-15 05:01:34.247383] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:14.871 [2024-12-15 05:01:34.247389] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:14.871 [2024-12-15 05:01:34.247397] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:11:14.871 [2024-12-15 05:01:34.247403] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:14.871 [2024-12-15 05:01:34.646314] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:11.0, 0] in failed state. 
00:11:14.871 [2024-12-15 05:01:34.648573] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:14.871 [2024-12-15 05:01:34.648695] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:11:14.871 [2024-12-15 05:01:34.648710] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:14.871 [2024-12-15 05:01:34.648722] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:14.871 [2024-12-15 05:01:34.648728] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:11:14.871 [2024-12-15 05:01:34.648737] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:14.871 [2024-12-15 05:01:34.648743] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:14.871 [2024-12-15 05:01:34.648751] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:11:14.871 [2024-12-15 05:01:34.648758] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:14.871 [2024-12-15 05:01:34.648765] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:14.871 [2024-12-15 05:01:34.648772] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:11:14.871 [2024-12-15 05:01:34.648781] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@51 -- # printf 'Still waiting for %s to be gone\n' 0000:00:10.0 0000:00:11.0 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:11:14.871 05:01:34 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:14.871 05:01:34 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:14.871 05:01:34 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 0 > 0 )) 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@56 -- # echo 1 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:10.0 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:10.0 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:11:14.871 05:01:34 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 
0000:00:11.0 00:11:15.133 05:01:35 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:11.0 00:11:15.133 05:01:35 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:11:15.133 05:01:35 sw_hotplug -- nvme/sw_hotplug.sh@66 -- # sleep 12 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@68 -- # true 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdfs=($(bdev_bdfs)) 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdev_bdfs 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:11:27.373 05:01:47 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:27.373 05:01:47 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:27.373 05:01:47 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@71 -- # [[ 0000:00:10.0 0000:00:11.0 == \0\0\0\0\:\0\0\:\1\0\.\0\ \0\0\0\0\:\0\0\:\1\1\.\0 ]] 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@43 -- # true 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:11:27.373 05:01:47 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:27.373 05:01:47 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:27.373 05:01:47 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 2 > 0 )) 00:11:27.373 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # sleep 0.5 00:11:27.373 [2024-12-15 05:01:47.146499] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:10.0, 0] in failed state. 
00:11:27.373 [2024-12-15 05:01:47.147354] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:27.373 [2024-12-15 05:01:47.147385] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:11:27.373 [2024-12-15 05:01:47.147397] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:27.373 [2024-12-15 05:01:47.147409] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:27.373 [2024-12-15 05:01:47.147418] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:11:27.373 [2024-12-15 05:01:47.147425] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:27.373 [2024-12-15 05:01:47.147432] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:27.373 [2024-12-15 05:01:47.147449] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:11:27.373 [2024-12-15 05:01:47.147457] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:27.373 [2024-12-15 05:01:47.147463] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:27.373 [2024-12-15 05:01:47.147471] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:11:27.373 [2024-12-15 05:01:47.147478] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:27.634 [2024-12-15 05:01:47.546502] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:11.0, 0] in failed state. 
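The ABORTED - BY REQUEST (00/07) completions above are the expected fallout of a surprise detach: status code type 00 with status code 07 is the NVMe generic "Command Abort Requested" status, reported here for the admin queue's outstanding ASYNC EVENT REQUEST commands as the controller is torn down. The surrounding @50/@51 traces are a poll loop waiting for the detached BDFs to drop out of the bdev list; a minimal sketch, with the exact source layout in sw_hotplug.sh assumed:

# Poll until the removed controllers disappear from bdev_get_bdevs
# (sw_hotplug.sh@50-51, as traced before and after these records).
while bdfs=($(bdev_bdfs)) && (( ${#bdfs[@]} > 0 )); do
    printf 'Still waiting for %s to be gone\n' "${bdfs[@]}"
    sleep 0.5
done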
00:11:27.634 [2024-12-15 05:01:47.547406] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:27.634 [2024-12-15 05:01:47.547451] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:11:27.634 [2024-12-15 05:01:47.547461] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:27.634 [2024-12-15 05:01:47.547473] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:27.634 [2024-12-15 05:01:47.547480] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:11:27.634 [2024-12-15 05:01:47.547488] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:27.634 [2024-12-15 05:01:47.547494] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:27.634 [2024-12-15 05:01:47.547502] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:11:27.634 [2024-12-15 05:01:47.547509] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:27.634 [2024-12-15 05:01:47.547516] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:27.634 [2024-12-15 05:01:47.547522] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:11:27.634 [2024-12-15 05:01:47.547530] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@51 -- # printf 'Still waiting for %s to be gone\n' 0000:00:10.0 0000:00:11.0 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:11:27.634 05:01:47 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:27.634 05:01:47 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:27.634 05:01:47 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 0 > 0 )) 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@56 -- # echo 1 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:11:27.634 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:10.0 00:11:27.895 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:10.0 00:11:27.895 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:11:27.895 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:11:27.895 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:11:27.895 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 
0000:00:11.0 00:11:27.895 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:11.0 00:11:27.895 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:11:27.895 05:01:47 sw_hotplug -- nvme/sw_hotplug.sh@66 -- # sleep 12 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@68 -- # true 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdfs=($(bdev_bdfs)) 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdev_bdfs 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:11:40.186 05:01:59 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:40.186 05:01:59 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:40.186 05:01:59 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@71 -- # [[ 0000:00:10.0 0000:00:11.0 == \0\0\0\0\:\0\0\:\1\0\.\0\ \0\0\0\0\:\0\0\:\1\1\.\0 ]] 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@39 -- # for dev in "${nvmes[@]}" 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@40 -- # echo 1 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@43 -- # true 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:11:40.186 05:01:59 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:40.186 05:01:59 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:40.186 05:01:59 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:11:40.186 05:02:00 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:40.186 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 2 > 0 )) 00:11:40.186 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # sleep 0.5 00:11:40.186 [2024-12-15 05:02:00.046692] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:10.0, 0] in failed state. 
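xtrace never prints redirections, so the @56-@62 echo sequence above shows only the values written, not the sysfs files that receive them. A plausible reconstruction of the re-attach step, in which every file path is an assumption (the standard Linux driver_override rebind recipe) rather than something this log confirms:

# Re-attach the controllers under uio_pci_generic (@56-@62); targets assumed.
echo 1 > /sys/bus/pci/rescan                                            # @56: re-discover removed devices
for dev in "${nvmes[@]}"; do
    echo uio_pci_generic > "/sys/bus/pci/devices/$dev/driver_override"  # @59: pin the driver
    echo "$dev" > "/sys/bus/pci/devices/$dev/driver/unbind"             # @60: drop the current binding, if any
    echo "$dev" > /sys/bus/pci/drivers_probe                            # @61: re-probe, honoring the override
    echo '' > "/sys/bus/pci/devices/$dev/driver_override"               # @62: clear the override again
done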
00:11:40.186 [2024-12-15 05:02:00.047488] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:40.186 [2024-12-15 05:02:00.047513] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:11:40.186 [2024-12-15 05:02:00.047526] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:40.186 [2024-12-15 05:02:00.047538] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:40.186 [2024-12-15 05:02:00.047551] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:11:40.186 [2024-12-15 05:02:00.047558] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:40.186 [2024-12-15 05:02:00.047567] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:40.186 [2024-12-15 05:02:00.047574] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:11:40.186 [2024-12-15 05:02:00.047582] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:40.186 [2024-12-15 05:02:00.047588] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:40.186 [2024-12-15 05:02:00.047596] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:11:40.186 [2024-12-15 05:02:00.047603] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:40.447 [2024-12-15 05:02:00.446696] nvme_ctrlr.c:1110:nvme_ctrlr_fail: *ERROR*: [0000:00:11.0, 0] in failed state. 
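Assembling the traced line numbers gives the shape of one hotplug iteration. A sketch only: insert_devices is a placeholder name for the @56-@62 sequence reconstructed above, the remove target is again an assumption, and the iteration count is not visible in this log.

# One pass of the surprise-removal loop, sw_hotplug.sh@38-@71 as traced.
nvmes=(0000:00:10.0 0000:00:11.0)   # BDFs from the trace
hotplug_events=3                    # assumed; actual count not shown in the log
while (( hotplug_events-- )); do                       # @38
    for dev in "${nvmes[@]}"; do
        echo 1 > "/sys/bus/pci/devices/$dev/remove"    # @39-40: surprise-detach
    done
    while bdfs=($(bdev_bdfs)) && (( ${#bdfs[@]} > 0 )); do
        printf 'Still waiting for %s to be gone\n' "${bdfs[@]}"   # @50-51
        sleep 0.5
    done
    insert_devices                                     # @56-@62: rescan and rebind
    sleep 12                                           # @66: let the hotplug monitor re-attach
    bdfs=($(bdev_bdfs))                                # @70
    [[ ${bdfs[*]} == "0000:00:10.0 0000:00:11.0" ]]    # @71: both controllers back
done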
00:11:40.447 [2024-12-15 05:02:00.447422] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:40.447 [2024-12-15 05:02:00.447461] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:190 nsid:0 cdw10:00000000 cdw11:00000000 00:11:40.447 [2024-12-15 05:02:00.447470] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:190 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:40.447 [2024-12-15 05:02:00.447481] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:40.447 [2024-12-15 05:02:00.447488] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:189 nsid:0 cdw10:00000000 cdw11:00000000 00:11:40.447 [2024-12-15 05:02:00.447496] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:189 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:40.447 [2024-12-15 05:02:00.447502] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:40.447 [2024-12-15 05:02:00.447512] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:188 nsid:0 cdw10:00000000 cdw11:00000000 00:11:40.447 [2024-12-15 05:02:00.447518] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:188 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:40.447 [2024-12-15 05:02:00.447526] nvme_pcie_common.c: 782:nvme_pcie_qpair_abort_trackers: *ERROR*: aborting outstanding command 00:11:40.447 [2024-12-15 05:02:00.447532] nvme_qpair.c: 223:nvme_admin_qpair_print_command: *NOTICE*: ASYNC EVENT REQUEST (0c) qid:0 cid:187 nsid:0 cdw10:00000000 cdw11:00000000 00:11:40.447 [2024-12-15 05:02:00.447540] nvme_qpair.c: 474:spdk_nvme_print_completion: *NOTICE*: ABORTED - BY REQUEST (00/07) qid:0 cid:187 cdw0:0 sqhd:0000 p:0 m:0 dnr:0 00:11:40.447 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@51 -- # printf 'Still waiting for %s to be gone\n' 0000:00:10.0 0000:00:11.0 00:11:40.447 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdfs=($(bdev_bdfs)) 00:11:40.447 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # bdev_bdfs 00:11:40.447 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:11:40.447 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:11:40.447 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:11:40.447 05:02:00 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:40.447 05:02:00 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:40.447 05:02:00 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:40.447 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@50 -- # (( 0 > 0 )) 00:11:40.447 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@56 -- # echo 1 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 0000:00:10.0 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:10.0 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@58 -- # for dev in "${nvmes[@]}" 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@59 -- # echo uio_pci_generic 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@60 -- # echo 
0000:00:11.0 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@61 -- # echo 0000:00:11.0 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@62 -- # echo '' 00:11:40.708 05:02:00 sw_hotplug -- nvme/sw_hotplug.sh@66 -- # sleep 12 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@68 -- # true 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdfs=($(bdev_bdfs)) 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@70 -- # bdev_bdfs 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@13 -- # sort -u 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # jq -r '.[].driver_specific.nvme[].pci_address' /dev/fd/63 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@12 -- # rpc_cmd bdev_get_bdevs 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@563 -- # xtrace_disable 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@71 -- # [[ 0000:00:10.0 0000:00:11.0 == \0\0\0\0\:\0\0\:\1\0\.\0\ \0\0\0\0\:\0\0\:\1\1\.\0 ]] 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@38 -- # (( hotplug_events-- )) 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@719 -- # time=44.67 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@720 -- # echo 44.67 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@722 -- # return 0 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@21 -- # helper_time=44.67 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@22 -- # printf 'remove_attach_helper took %ss to complete (handling %u nvme drive(s))' 44.67 2 00:11:52.943 remove_attach_helper took 44.67s to complete (handling 2 nvme drive(s)) 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@124 -- # trap - SIGINT SIGTERM EXIT 00:11:52.943 05:02:12 sw_hotplug -- nvme/sw_hotplug.sh@125 -- # killprocess 80834 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@954 -- # '[' -z 80834 ']' 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@958 -- # kill -0 80834 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@959 -- # uname 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 80834 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:11:52.943 killing process with pid 80834 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@972 -- # echo 'killing process with pid 80834' 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@973 -- # kill 80834 00:11:52.943 05:02:12 sw_hotplug -- common/autotest_common.sh@978 -- # wait 80834 00:11:53.204 05:02:13 sw_hotplug -- nvme/sw_hotplug.sh@154 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:11:53.465 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:11:53.726 0000:00:11.0 (1b36 0010): Already using the uio_pci_generic driver 00:11:53.726 0000:00:10.0 (1b36 0010): Already using the uio_pci_generic driver 00:11:53.987 0000:00:13.0 (1b36 0010): nvme -> uio_pci_generic 00:11:53.987 0000:00:12.0 (1b36 0010): nvme -> uio_pci_generic 00:11:53.987 00:11:53.987 real 2m28.656s 00:11:53.987 user 1m49.228s 00:11:53.987 sys 0m17.916s 00:11:53.987 05:02:14 sw_hotplug -- 
common/autotest_common.sh@1130 -- # xtrace_disable 00:11:53.987 ************************************ 00:11:53.987 END TEST sw_hotplug 00:11:53.987 ************************************ 00:11:53.987 05:02:14 sw_hotplug -- common/autotest_common.sh@10 -- # set +x 00:11:53.987 05:02:14 -- spdk/autotest.sh@243 -- # [[ 1 -eq 1 ]] 00:11:53.987 05:02:14 -- spdk/autotest.sh@244 -- # run_test nvme_xnvme /home/vagrant/spdk_repo/spdk/test/nvme/xnvme/xnvme.sh 00:11:53.987 05:02:14 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:11:53.988 05:02:14 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:11:53.988 05:02:14 -- common/autotest_common.sh@10 -- # set +x 00:11:53.988 ************************************ 00:11:53.988 START TEST nvme_xnvme 00:11:53.988 ************************************ 00:11:53.988 05:02:14 nvme_xnvme -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/nvme/xnvme/xnvme.sh 00:11:54.251 * Looking for test storage... 00:11:54.251 * Found test storage at /home/vagrant/spdk_repo/spdk/test/nvme/xnvme 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@1711 -- # lcov --version 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@333 -- # local ver1 ver1_l 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@334 -- # local ver2 ver2_l 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@336 -- # IFS=.-: 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@336 -- # read -ra ver1 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@337 -- # IFS=.-: 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@337 -- # read -ra ver2 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@338 -- # local 'op=<' 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@340 -- # ver1_l=2 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@341 -- # ver2_l=1 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@344 -- # case "$op" in 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@345 -- # : 1 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@364 -- # (( v = 0 )) 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@365 -- # decimal 1 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@353 -- # local d=1 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@355 -- # echo 1 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@365 -- # ver1[v]=1 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@366 -- # decimal 2 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@353 -- # local d=2 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@355 -- # echo 2 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@366 -- # ver2[v]=2 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:11:54.251 05:02:14 nvme_xnvme -- scripts/common.sh@368 -- # return 0 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:11:54.251 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:54.251 --rc genhtml_branch_coverage=1 00:11:54.251 --rc genhtml_function_coverage=1 00:11:54.251 --rc genhtml_legend=1 00:11:54.251 --rc geninfo_all_blocks=1 00:11:54.251 --rc geninfo_unexecuted_blocks=1 00:11:54.251 00:11:54.251 ' 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:11:54.251 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:54.251 --rc genhtml_branch_coverage=1 00:11:54.251 --rc genhtml_function_coverage=1 00:11:54.251 --rc genhtml_legend=1 00:11:54.251 --rc geninfo_all_blocks=1 00:11:54.251 --rc geninfo_unexecuted_blocks=1 00:11:54.251 00:11:54.251 ' 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:11:54.251 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:54.251 --rc genhtml_branch_coverage=1 00:11:54.251 --rc genhtml_function_coverage=1 00:11:54.251 --rc genhtml_legend=1 00:11:54.251 --rc geninfo_all_blocks=1 00:11:54.251 --rc geninfo_unexecuted_blocks=1 00:11:54.251 00:11:54.251 ' 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:11:54.251 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:54.251 --rc genhtml_branch_coverage=1 00:11:54.251 --rc genhtml_function_coverage=1 00:11:54.251 --rc genhtml_legend=1 00:11:54.251 --rc geninfo_all_blocks=1 00:11:54.251 --rc geninfo_unexecuted_blocks=1 00:11:54.251 00:11:54.251 ' 00:11:54.251 05:02:14 nvme_xnvme -- xnvme/common.sh@10 -- # source /home/vagrant/spdk_repo/spdk/test/dd/common.sh 00:11:54.251 05:02:14 nvme_xnvme -- dd/common.sh@6 -- # source /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@34 -- # set -e 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@36 -- # shopt -s extglob 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@39 -- # '[' -z /home/vagrant/spdk_repo/spdk/../output ']' 00:11:54.251 05:02:14 
nvme_xnvme -- common/autotest_common.sh@44 -- # [[ -e /home/vagrant/spdk_repo/spdk/test/common/build_config.sh ]] 00:11:54.251 05:02:14 nvme_xnvme -- common/autotest_common.sh@45 -- # source /home/vagrant/spdk_repo/spdk/test/common/build_config.sh 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@2 -- # CONFIG_ASAN=y 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=n 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@17 -- # CONFIG_MAX_NUMA_NODES=1 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@18 -- # CONFIG_PGO_CAPTURE=n 00:11:54.251 05:02:14 nvme_xnvme -- common/build_config.sh@19 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@20 -- # CONFIG_ENV=/home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@21 -- # CONFIG_LTO=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@22 -- # CONFIG_ISCSI_INITIATOR=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@23 -- # CONFIG_CET=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@24 -- # CONFIG_VBDEV_COMPRESS_MLX5=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@25 -- # CONFIG_OCF_PATH= 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@26 -- # CONFIG_RDMA_SET_TOS=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@27 -- # CONFIG_AIO_FSDEV=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@28 -- # CONFIG_HAVE_ARC4RANDOM=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@29 -- # CONFIG_HAVE_LIBARCHIVE=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@30 -- # CONFIG_UBLK=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@31 -- # CONFIG_ISAL_CRYPTO=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@32 -- # CONFIG_OPENSSL_PATH= 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@33 -- # CONFIG_OCF=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@34 -- # CONFIG_FUSE=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@35 -- # CONFIG_VTUNE_DIR= 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@36 -- # CONFIG_FUZZER_LIB= 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@37 -- # CONFIG_FUZZER=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@38 -- # CONFIG_FSDEV=y 
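The CONFIG_* variables sourced from build_config.sh above are the shell-side mirror of include/spdk/config.h, which applications.sh@23 dumps and pattern-matches just below. The correspondence visible in this log is mechanical: y becomes a '#define ... 1', n becomes '#undef', and anything else passes through as the macro's value. An illustration of that mapping (not the actual SPDK configure code):

# Map shell-style CONFIG_* settings onto the SPDK_CONFIG_* header lines.
while IFS='=' read -r key val; do
    name="SPDK_${key}"                       # CONFIG_ASAN -> SPDK_CONFIG_ASAN
    case "$val" in
        y) printf '#define %s 1\n' "$name" ;;
        n) printf '#undef %s\n' "$name" ;;
        *) printf '#define %s %s\n' "$name" "$val" ;;
    esac
done <<'EOF'
CONFIG_ASAN=y
CONFIG_RBD=n
CONFIG_MAX_LCORES=128
EOF

This prints exactly the forms seen in the header dump below: '#define SPDK_CONFIG_ASAN 1', '#undef SPDK_CONFIG_RBD', '#define SPDK_CONFIG_MAX_LCORES 128'.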
00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@39 -- # CONFIG_DPDK_DIR=/home/vagrant/spdk_repo/dpdk/build 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@40 -- # CONFIG_CRYPTO=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@41 -- # CONFIG_PGO_USE=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@42 -- # CONFIG_VHOST=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@43 -- # CONFIG_DAOS=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@44 -- # CONFIG_DPDK_INC_DIR=//home/vagrant/spdk_repo/dpdk/build/include 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@45 -- # CONFIG_DAOS_DIR= 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@46 -- # CONFIG_UNIT_TESTS=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@47 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@48 -- # CONFIG_VIRTIO=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@49 -- # CONFIG_DPDK_UADK=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@50 -- # CONFIG_COVERAGE=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@51 -- # CONFIG_RDMA=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@52 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIM=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@53 -- # CONFIG_HAVE_LZ4=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@54 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@55 -- # CONFIG_URING_PATH= 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@56 -- # CONFIG_XNVME=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@57 -- # CONFIG_VFIO_USER=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@58 -- # CONFIG_ARCH=native 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@59 -- # CONFIG_HAVE_EVP_MAC=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@60 -- # CONFIG_URING_ZNS=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@61 -- # CONFIG_WERROR=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@62 -- # CONFIG_HAVE_LIBBSD=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@63 -- # CONFIG_UBSAN=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@64 -- # CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@65 -- # CONFIG_IPSEC_MB_DIR= 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@66 -- # CONFIG_GOLANG=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@67 -- # CONFIG_ISAL=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@68 -- # CONFIG_IDXD_KERNEL=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@69 -- # CONFIG_DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@70 -- # CONFIG_RDMA_PROV=verbs 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@71 -- # CONFIG_APPS=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@72 -- # CONFIG_SHARED=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@73 -- # CONFIG_HAVE_KEYUTILS=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@74 -- # CONFIG_FC_PATH= 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@75 -- # CONFIG_DPDK_PKG_CONFIG=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@76 -- # CONFIG_FC=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@77 -- # CONFIG_AVAHI=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@78 -- 
# CONFIG_FIO_PLUGIN=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@79 -- # CONFIG_RAID5F=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@80 -- # CONFIG_EXAMPLES=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@81 -- # CONFIG_TESTS=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@82 -- # CONFIG_CRYPTO_MLX5=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@83 -- # CONFIG_MAX_LCORES=128 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@84 -- # CONFIG_IPSEC_MB=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@85 -- # CONFIG_PGO_DIR= 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@86 -- # CONFIG_DEBUG=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@87 -- # CONFIG_DPDK_COMPRESSDEV=n 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@88 -- # CONFIG_CROSS_PREFIX= 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@89 -- # CONFIG_COPY_FILE_RANGE=y 00:11:54.252 05:02:14 nvme_xnvme -- common/build_config.sh@90 -- # CONFIG_URING=n 00:11:54.252 05:02:14 nvme_xnvme -- common/autotest_common.sh@54 -- # source /home/vagrant/spdk_repo/spdk/test/common/applications.sh 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/common/applications.sh 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/common 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@8 -- # _root=/home/vagrant/spdk_repo/spdk/test/common 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@9 -- # _root=/home/vagrant/spdk_repo/spdk 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@10 -- # _app_dir=/home/vagrant/spdk_repo/spdk/build/bin 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@11 -- # _test_app_dir=/home/vagrant/spdk_repo/spdk/test/app 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@12 -- # _examples_dir=/home/vagrant/spdk_repo/spdk/build/examples 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@22 -- # [[ -e /home/vagrant/spdk_repo/spdk/include/spdk/config.h ]] 00:11:54.252 05:02:14 nvme_xnvme -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:11:54.252 #define SPDK_CONFIG_H 00:11:54.252 #define SPDK_CONFIG_AIO_FSDEV 1 00:11:54.252 #define SPDK_CONFIG_APPS 1 00:11:54.252 #define SPDK_CONFIG_ARCH native 00:11:54.252 #define SPDK_CONFIG_ASAN 1 00:11:54.252 #undef SPDK_CONFIG_AVAHI 00:11:54.252 #undef SPDK_CONFIG_CET 00:11:54.252 #define SPDK_CONFIG_COPY_FILE_RANGE 1 00:11:54.252 #define SPDK_CONFIG_COVERAGE 1 00:11:54.252 #define SPDK_CONFIG_CROSS_PREFIX 00:11:54.252 #undef SPDK_CONFIG_CRYPTO 00:11:54.252 #undef SPDK_CONFIG_CRYPTO_MLX5 00:11:54.252 #undef SPDK_CONFIG_CUSTOMOCF 00:11:54.252 #undef SPDK_CONFIG_DAOS 00:11:54.252 #define SPDK_CONFIG_DAOS_DIR 00:11:54.252 #define SPDK_CONFIG_DEBUG 1 00:11:54.252 #undef 
SPDK_CONFIG_DPDK_COMPRESSDEV 00:11:54.252 #define SPDK_CONFIG_DPDK_DIR /home/vagrant/spdk_repo/dpdk/build 00:11:54.252 #define SPDK_CONFIG_DPDK_INC_DIR //home/vagrant/spdk_repo/dpdk/build/include 00:11:54.252 #define SPDK_CONFIG_DPDK_LIB_DIR /home/vagrant/spdk_repo/dpdk/build/lib 00:11:54.252 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:11:54.252 #undef SPDK_CONFIG_DPDK_UADK 00:11:54.252 #define SPDK_CONFIG_ENV /home/vagrant/spdk_repo/spdk/lib/env_dpdk 00:11:54.252 #define SPDK_CONFIG_EXAMPLES 1 00:11:54.252 #undef SPDK_CONFIG_FC 00:11:54.252 #define SPDK_CONFIG_FC_PATH 00:11:54.252 #define SPDK_CONFIG_FIO_PLUGIN 1 00:11:54.252 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:11:54.252 #define SPDK_CONFIG_FSDEV 1 00:11:54.252 #undef SPDK_CONFIG_FUSE 00:11:54.252 #undef SPDK_CONFIG_FUZZER 00:11:54.252 #define SPDK_CONFIG_FUZZER_LIB 00:11:54.252 #undef SPDK_CONFIG_GOLANG 00:11:54.252 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:11:54.252 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:11:54.252 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:11:54.252 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:11:54.252 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:11:54.252 #undef SPDK_CONFIG_HAVE_LIBBSD 00:11:54.252 #undef SPDK_CONFIG_HAVE_LZ4 00:11:54.252 #define SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIM 1 00:11:54.252 #undef SPDK_CONFIG_HAVE_STRUCT_STAT_ST_ATIMESPEC 00:11:54.252 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:11:54.252 #define SPDK_CONFIG_IDXD 1 00:11:54.252 #define SPDK_CONFIG_IDXD_KERNEL 1 00:11:54.252 #undef SPDK_CONFIG_IPSEC_MB 00:11:54.252 #define SPDK_CONFIG_IPSEC_MB_DIR 00:11:54.252 #define SPDK_CONFIG_ISAL 1 00:11:54.252 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:11:54.252 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:11:54.252 #define SPDK_CONFIG_LIBDIR 00:11:54.252 #undef SPDK_CONFIG_LTO 00:11:54.252 #define SPDK_CONFIG_MAX_LCORES 128 00:11:54.252 #define SPDK_CONFIG_MAX_NUMA_NODES 1 00:11:54.252 #define SPDK_CONFIG_NVME_CUSE 1 00:11:54.252 #undef SPDK_CONFIG_OCF 00:11:54.252 #define SPDK_CONFIG_OCF_PATH 00:11:54.252 #define SPDK_CONFIG_OPENSSL_PATH 00:11:54.252 #undef SPDK_CONFIG_PGO_CAPTURE 00:11:54.252 #define SPDK_CONFIG_PGO_DIR 00:11:54.252 #undef SPDK_CONFIG_PGO_USE 00:11:54.252 #define SPDK_CONFIG_PREFIX /usr/local 00:11:54.252 #undef SPDK_CONFIG_RAID5F 00:11:54.252 #undef SPDK_CONFIG_RBD 00:11:54.252 #define SPDK_CONFIG_RDMA 1 00:11:54.252 #define SPDK_CONFIG_RDMA_PROV verbs 00:11:54.252 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:11:54.252 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:11:54.252 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:11:54.252 #define SPDK_CONFIG_SHARED 1 00:11:54.252 #undef SPDK_CONFIG_SMA 00:11:54.252 #define SPDK_CONFIG_TESTS 1 00:11:54.252 #undef SPDK_CONFIG_TSAN 00:11:54.252 #define SPDK_CONFIG_UBLK 1 00:11:54.252 #define SPDK_CONFIG_UBSAN 1 00:11:54.252 #undef SPDK_CONFIG_UNIT_TESTS 00:11:54.252 #undef SPDK_CONFIG_URING 00:11:54.252 #define SPDK_CONFIG_URING_PATH 00:11:54.252 #undef SPDK_CONFIG_URING_ZNS 00:11:54.252 #undef SPDK_CONFIG_USDT 00:11:54.252 #undef SPDK_CONFIG_VBDEV_COMPRESS 00:11:54.253 #undef SPDK_CONFIG_VBDEV_COMPRESS_MLX5 00:11:54.253 #undef SPDK_CONFIG_VFIO_USER 00:11:54.253 #define SPDK_CONFIG_VFIO_USER_DIR 00:11:54.253 #define SPDK_CONFIG_VHOST 1 00:11:54.253 #define SPDK_CONFIG_VIRTIO 1 00:11:54.253 #undef SPDK_CONFIG_VTUNE 00:11:54.253 #define SPDK_CONFIG_VTUNE_DIR 00:11:54.253 #define SPDK_CONFIG_WERROR 1 00:11:54.253 #define SPDK_CONFIG_WPDK_DIR 00:11:54.253 #define SPDK_CONFIG_XNVME 1 00:11:54.253 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ 
\S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:11:54.253 05:02:14 nvme_xnvme -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@55 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:11:54.253 05:02:14 nvme_xnvme -- scripts/common.sh@15 -- # shopt -s extglob 00:11:54.253 05:02:14 nvme_xnvme -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:11:54.253 05:02:14 nvme_xnvme -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:11:54.253 05:02:14 nvme_xnvme -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:11:54.253 05:02:14 nvme_xnvme -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:54.253 05:02:14 nvme_xnvme -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:54.253 05:02:14 nvme_xnvme -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:54.253 05:02:14 nvme_xnvme -- paths/export.sh@5 -- # export PATH 00:11:54.253 05:02:14 nvme_xnvme -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@56 -- # source /home/vagrant/spdk_repo/spdk/scripts/perf/pm/common 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@6 -- # dirname /home/vagrant/spdk_repo/spdk/scripts/perf/pm/common 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@6 -- # readlink -f /home/vagrant/spdk_repo/spdk/scripts/perf/pm 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@6 -- # _pmdir=/home/vagrant/spdk_repo/spdk/scripts/perf/pm 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@7 -- # readlink -f /home/vagrant/spdk_repo/spdk/scripts/perf/pm/../../../ 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@7 -- # _pmrootdir=/home/vagrant/spdk_repo/spdk 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@64 -- # TEST_TAG=N/A 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@65 -- # TEST_TAG_FILE=/home/vagrant/spdk_repo/spdk/.run_test_name 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@67 -- # PM_OUTPUTDIR=/home/vagrant/spdk_repo/spdk/../output/power 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@68 -- 
# uname -s 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@68 -- # PM_OS=Linux 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@76 -- # SUDO[0]= 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@76 -- # SUDO[1]='sudo -E' 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@81 -- # [[ Linux == Linux ]] 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@81 -- # [[ QEMU != QEMU ]] 00:11:54.253 05:02:14 nvme_xnvme -- pm/common@88 -- # [[ ! -d /home/vagrant/spdk_repo/spdk/../output/power ]] 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@58 -- # : 1 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@62 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@64 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@66 -- # : 1 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@68 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@70 -- # : 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@72 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@74 -- # : 1 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@76 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@78 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@80 -- # : 1 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@82 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@84 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@86 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- 
common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@88 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@90 -- # : 1 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@92 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@94 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@96 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@98 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@100 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@102 -- # : rdma 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@104 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@106 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@108 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@110 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@111 -- # export SPDK_TEST_RAID 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@112 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@113 -- # export SPDK_TEST_IOAT 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@114 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@115 -- # export SPDK_TEST_BLOBFS 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@116 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@117 -- # export SPDK_TEST_VHOST_INIT 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@118 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@119 -- # export SPDK_TEST_LVOL 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@120 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@121 -- # export SPDK_TEST_VBDEV_COMPRESS 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@122 -- # : 1 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@123 -- # export SPDK_RUN_ASAN 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@124 -- # : 1 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@125 -- # export SPDK_RUN_UBSAN 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@126 -- # : /home/vagrant/spdk_repo/dpdk/build 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@127 -- # export SPDK_RUN_EXTERNAL_DPDK 00:11:54.253 05:02:14 nvme_xnvme -- 
common/autotest_common.sh@128 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@129 -- # export SPDK_RUN_NON_ROOT 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@130 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@131 -- # export SPDK_TEST_CRYPTO 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@132 -- # : 1 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@133 -- # export SPDK_TEST_FTL 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@134 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@135 -- # export SPDK_TEST_OCF 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@136 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@137 -- # export SPDK_TEST_VMD 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@138 -- # : 0 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@139 -- # export SPDK_TEST_OPAL 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@140 -- # : v23.11 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@141 -- # export SPDK_TEST_NATIVE_DPDK 00:11:54.253 05:02:14 nvme_xnvme -- common/autotest_common.sh@142 -- # : true 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@143 -- # export SPDK_AUTOTEST_X 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@144 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@146 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@148 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@150 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@152 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@154 -- # : 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@156 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@158 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@160 -- # : 1 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@162 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@164 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@166 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@169 -- # : 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 
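The long run of ': 0' / ': 1' lines paired with exports above (autotest_common.sh@58-@178) is the usual set-a-default-then-export idiom: the ':' builtin evaluates and discards its arguments, and xtrace prints it with the parameter expansion already collapsed to its result. A reconstruction with names and defaults taken from the trace (the exact ${VAR:=default} spelling is assumed):

# Defaults for the test matrix; ${VAR:=default} assigns only when VAR is unset,
# so values handed in by the CI environment win over these fallbacks.
: "${RUN_NIGHTLY:=1}";                export RUN_NIGHTLY              # @58-59
: "${SPDK_TEST_NVME:=1}";             export SPDK_TEST_NVME           # @80-81
: "${SPDK_TEST_NATIVE_DPDK:=v23.11}"; export SPDK_TEST_NATIVE_DPDK    # @140-141
: "${SPDK_TEST_XNVME:=1}";            export SPDK_TEST_XNVME          # @160-161

Each flag keeps whatever the CI environment provided and only falls back to the default when unset, which is how this run arrives at SPDK_TEST_XNVME=1 and the nvme_xnvme test below.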
00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@171 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@173 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@175 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@176 -- # export SPDK_TEST_SETUP 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@177 -- # : 0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@178 -- # export SPDK_TEST_NVME_INTERRUPT 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@181 -- # export SPDK_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/lib 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@181 -- # SPDK_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/lib 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@182 -- # export DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@182 -- # DPDK_LIB_DIR=/home/vagrant/spdk_repo/dpdk/build/lib 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@183 -- # export VFIO_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@183 -- # VFIO_LIB_DIR=/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@184 -- # export LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/spdk/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@184 -- # LD_LIBRARY_PATH=:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/spdk/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib:/home/vagrant/spdk_repo/spdk/build/lib:/home/vagrant/spdk_repo/dpdk/build/lib:/home/vagrant/spdk_repo/spdk/build/libvfio-user/usr/local/lib 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@187 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@187 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@191 -- # export 
PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@191 -- # PYTHONPATH=:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python:/home/vagrant/spdk_repo/spdk/test/rpc_plugins:/home/vagrant/spdk_repo/spdk/python 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@195 -- # export PYTHONDONTWRITEBYTECODE=1 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@195 -- # PYTHONDONTWRITEBYTECODE=1 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@199 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@199 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@200 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@200 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@204 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@205 -- # rm -rf /var/tmp/asan_suppression_file 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@206 -- # cat 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@242 -- # echo leak:libfuse3.so 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@244 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@244 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@246 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@246 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@248 -- # '[' -z /var/spdk/dependencies ']' 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@251 -- # export DEPENDENCY_DIR 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@255 -- # export SPDK_BIN_DIR=/home/vagrant/spdk_repo/spdk/build/bin 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@255 -- # SPDK_BIN_DIR=/home/vagrant/spdk_repo/spdk/build/bin 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@256 -- # export SPDK_EXAMPLE_DIR=/home/vagrant/spdk_repo/spdk/build/examples 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@256 -- # SPDK_EXAMPLE_DIR=/home/vagrant/spdk_repo/spdk/build/examples 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@259 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@259 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@260 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@260 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 
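Sanitizer wiring, as traced at autotest_common.sh@199-@244 above: ASAN and UBSAN options are exported verbatim, and a leak-suppression file is rebuilt on every run and handed to LSAN. The redirection that writes the suppression entry into the file is not visible in the xtrace output, so that detail is inferred:

export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
asan_suppression_file=/var/tmp/asan_suppression_file      # @204
rm -rf "$asan_suppression_file"                           # @205: start clean each run
echo leak:libfuse3.so >> "$asan_suppression_file"         # @242: known libfuse3 leak (redirect inferred)
export LSAN_OPTIONS=suppressions=$asan_suppression_file   # @244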
00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@262 -- # export AR_TOOL=/home/vagrant/spdk_repo/spdk/scripts/ar-xnvme-fixer 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@262 -- # AR_TOOL=/home/vagrant/spdk_repo/spdk/scripts/ar-xnvme-fixer 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@265 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@265 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@267 -- # _LCOV_MAIN=0 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@268 -- # _LCOV_LLVM=1 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@269 -- # _LCOV= 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@270 -- # [[ '' == *clang* ]] 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@270 -- # [[ 0 -eq 1 ]] 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@272 -- # _lcov_opt[_LCOV_LLVM]='--gcov-tool /home/vagrant/spdk_repo/spdk/test/fuzz/llvm/llvm-gcov.sh' 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@273 -- # _lcov_opt[_LCOV_MAIN]= 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@275 -- # lcov_opt= 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@278 -- # '[' 0 -eq 0 ']' 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@279 -- # export valgrind= 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@279 -- # valgrind= 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@285 -- # uname -s 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@285 -- # '[' Linux = Linux ']' 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@286 -- # HUGEMEM=4096 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@287 -- # export CLEAR_HUGE=yes 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@287 -- # CLEAR_HUGE=yes 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@289 -- # MAKE=make 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@290 -- # MAKEFLAGS=-j10 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@306 -- # export HUGEMEM=4096 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@306 -- # HUGEMEM=4096 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@308 -- # NO_HUGE=() 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@309 -- # TEST_MODE= 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@331 -- # [[ -z 82179 ]] 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@331 -- # kill -0 82179 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@1696 -- # set_test_storage 2147483648 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@341 -- # [[ -v testdir ]] 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@343 -- # local requested_size=2147483648 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@344 -- # local mount target_dir 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@346 -- # local -A mounts fss sizes avails uses 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@347 -- # local source fs size avail mount use 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@349 -- # local storage_fallback storage_candidates 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@351 -- # mktemp -udt spdk.XXXXXX 00:11:54.254 05:02:14 nvme_xnvme -- common/autotest_common.sh@351 -- # storage_fallback=/tmp/spdk.m4UCGe 00:11:54.255 05:02:14 
nvme_xnvme -- common/autotest_common.sh@356 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@358 -- # [[ -n '' ]] 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@363 -- # [[ -n '' ]] 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@368 -- # mkdir -p /home/vagrant/spdk_repo/spdk/test/nvme/xnvme /tmp/spdk.m4UCGe/tests/xnvme /tmp/spdk.m4UCGe 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@371 -- # requested_size=2214592512 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@340 -- # grep -v Filesystem 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@340 -- # df -T 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/vda5 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # fss["$mount"]=btrfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # avails["$mount"]=13240651776 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # sizes["$mount"]=20314062848 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@376 -- # uses["$mount"]=6343356416 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # mounts["$mount"]=devtmpfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # fss["$mount"]=devtmpfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # avails["$mount"]=4194304 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # sizes["$mount"]=4194304 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@376 -- # uses["$mount"]=0 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # avails["$mount"]=6261964800 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # sizes["$mount"]=6265393152 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@376 -- # uses["$mount"]=3428352 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # avails["$mount"]=2493362176 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # sizes["$mount"]=2506158080 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@376 -- # uses["$mount"]=12795904 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/vda5 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # fss["$mount"]=btrfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # avails["$mount"]=13240651776 00:11:54.255 05:02:14 nvme_xnvme -- 
common/autotest_common.sh@375 -- # sizes["$mount"]=20314062848 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@376 -- # uses["$mount"]=6343356416 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/vda2 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # fss["$mount"]=ext4 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # avails["$mount"]=840085504 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # sizes["$mount"]=1012768768 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@376 -- # uses["$mount"]=103477248 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # avails["$mount"]=6265241600 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # sizes["$mount"]=6265393152 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@376 -- # uses["$mount"]=151552 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # mounts["$mount"]=/dev/vda3 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # fss["$mount"]=vfat 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # avails["$mount"]=91617280 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # sizes["$mount"]=104607744 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@376 -- # uses["$mount"]=12990464 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # mounts["$mount"]=tmpfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # fss["$mount"]=tmpfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # avails["$mount"]=1253064704 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # sizes["$mount"]=1253076992 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@376 -- # uses["$mount"]=12288 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # mounts["$mount"]=:/mnt/jenkins_nvme/jenkins/workspace/nvme-vg-autotest/fedora39-libvirt/output 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@374 -- # fss["$mount"]=fuse.sshfs 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # avails["$mount"]=98482434048 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@375 -- # sizes["$mount"]=105088212992 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@376 -- # uses["$mount"]=1220345856 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@373 -- # read -r source fs size use avail _ mount 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@379 -- # printf '* Looking for test storage...\n' 00:11:54.255 * Looking for test storage... 
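
The set_test_storage trace above fills one associative array per df column, then scans the candidate directories for the first mount with enough free space. A minimal re-creation of that loop follows; the df and awk invocations are the ones in the trace, and the surrounding function is illustrative:

# Minimal re-creation of the storage probe traced above.
find_test_storage() {
    local testdir=/home/vagrant/spdk_repo/spdk/test/nvme/xnvme
    local requested_size=2214592512   # 2 GiB plus margin, as in the trace
    local -A mounts fss sizes avails uses
    local source fs size use avail _ mount target_space

    # One entry per mountpoint: backing device, fs type, size, used, available.
    while read -r source fs size use avail _ mount; do
        mounts["$mount"]=$source
        fss["$mount"]=$fs
        sizes["$mount"]=$size
        avails["$mount"]=$avail
        uses["$mount"]=$use
    done < <(df -T | grep -v Filesystem)

    # Resolve the mount backing the test directory, then check its free space.
    mount=$(df "$testdir" | awk '$1 !~ /Filesystem/{print $6}')
    target_space=${avails[$mount]}
    (( target_space >= requested_size )) && printf '* Found test storage at %s\n' "$testdir"
}
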
00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@381 -- # local target_space new_size 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@382 -- # for target_dir in "${storage_candidates[@]}" 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@385 -- # df /home/vagrant/spdk_repo/spdk/test/nvme/xnvme 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@385 -- # awk '$1 !~ /Filesystem/{print $6}' 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@385 -- # mount=/home 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@387 -- # target_space=13240651776 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@388 -- # (( target_space == 0 || target_space < requested_size )) 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@391 -- # (( target_space >= requested_size )) 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@393 -- # [[ btrfs == tmpfs ]] 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@393 -- # [[ btrfs == ramfs ]] 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@393 -- # [[ /home == / ]] 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@400 -- # export SPDK_TEST_STORAGE=/home/vagrant/spdk_repo/spdk/test/nvme/xnvme 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@400 -- # SPDK_TEST_STORAGE=/home/vagrant/spdk_repo/spdk/test/nvme/xnvme 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@401 -- # printf '* Found test storage at %s\n' /home/vagrant/spdk_repo/spdk/test/nvme/xnvme 00:11:54.255 * Found test storage at /home/vagrant/spdk_repo/spdk/test/nvme/xnvme 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@402 -- # return 0 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@1698 -- # set -o errtrace 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@1699 -- # shopt -s extdebug 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@1700 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@1702 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@1703 -- # true 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@1705 -- # xtrace_fd 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@27 -- # exec 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@29 -- # exec 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@31 -- # xtrace_restore 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@18 -- # set -x 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@1711 -- # lcov --version 00:11:54.255 05:02:14 nvme_xnvme -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:11:54.517 05:02:14 nvme_xnvme -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@333 -- # local ver1 ver1_l 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@334 -- # local ver2 ver2_l 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@336 -- # IFS=.-: 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@336 -- # read -ra ver1 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@337 -- # IFS=.-: 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@337 -- # read -ra ver2 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@338 -- # local 'op=<' 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@340 -- # ver1_l=2 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@341 -- # ver2_l=1 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@344 -- # case "$op" in 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@345 -- # : 1 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@364 -- # (( v = 0 )) 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@365 -- # decimal 1 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@353 -- # local d=1 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@355 -- # echo 1 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@365 -- # ver1[v]=1 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@366 -- # decimal 2 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@353 -- # local d=2 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@355 -- # echo 2 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@366 -- # ver2[v]=2 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@368 -- # return 0 00:11:54.517 05:02:14 nvme_xnvme -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:11:54.517 05:02:14 nvme_xnvme -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:11:54.517 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:54.517 --rc genhtml_branch_coverage=1 00:11:54.517 --rc genhtml_function_coverage=1 00:11:54.517 --rc genhtml_legend=1 00:11:54.517 --rc geninfo_all_blocks=1 00:11:54.517 --rc geninfo_unexecuted_blocks=1 00:11:54.517 00:11:54.517 ' 00:11:54.517 05:02:14 nvme_xnvme -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:11:54.517 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:54.517 --rc genhtml_branch_coverage=1 00:11:54.517 --rc genhtml_function_coverage=1 00:11:54.517 --rc genhtml_legend=1 00:11:54.517 --rc geninfo_all_blocks=1 
00:11:54.517 --rc geninfo_unexecuted_blocks=1 00:11:54.517 00:11:54.517 ' 00:11:54.517 05:02:14 nvme_xnvme -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:11:54.517 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:54.517 --rc genhtml_branch_coverage=1 00:11:54.517 --rc genhtml_function_coverage=1 00:11:54.517 --rc genhtml_legend=1 00:11:54.517 --rc geninfo_all_blocks=1 00:11:54.517 --rc geninfo_unexecuted_blocks=1 00:11:54.517 00:11:54.517 ' 00:11:54.517 05:02:14 nvme_xnvme -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:11:54.517 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:11:54.517 --rc genhtml_branch_coverage=1 00:11:54.517 --rc genhtml_function_coverage=1 00:11:54.517 --rc genhtml_legend=1 00:11:54.517 --rc geninfo_all_blocks=1 00:11:54.517 --rc geninfo_unexecuted_blocks=1 00:11:54.517 00:11:54.517 ' 00:11:54.517 05:02:14 nvme_xnvme -- dd/common.sh@7 -- # source /home/vagrant/spdk_repo/spdk/scripts/common.sh 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@15 -- # shopt -s extglob 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@544 -- # [[ -e /bin/wpdk_common.sh ]] 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@552 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:11:54.517 05:02:14 nvme_xnvme -- scripts/common.sh@553 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:11:54.517 05:02:14 nvme_xnvme -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:54.517 05:02:14 nvme_xnvme -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:54.517 05:02:14 nvme_xnvme -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:54.517 05:02:14 nvme_xnvme -- paths/export.sh@5 -- # export PATH 00:11:54.517 05:02:14 nvme_xnvme -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:11:54.517 05:02:14 nvme_xnvme -- 
xnvme/common.sh@12 -- # xnvme_io=('libaio' 'io_uring' 'io_uring_cmd') 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@12 -- # declare -a xnvme_io 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@18 -- # libaio=('randread' 'randwrite') 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@18 -- # declare -a libaio 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@23 -- # io_uring=('randread' 'randwrite') 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@23 -- # declare -a io_uring 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@27 -- # io_uring_cmd=('randread' 'randwrite' 'unmap' 'write_zeroes') 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@27 -- # declare -a io_uring_cmd 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@33 -- # libaio_fio=('randread' 'randwrite') 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@33 -- # declare -a libaio_fio 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@37 -- # io_uring_fio=('randread' 'randwrite') 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@37 -- # declare -a io_uring_fio 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@41 -- # io_uring_cmd_fio=('randread' 'randwrite') 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@41 -- # declare -a io_uring_cmd_fio 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@45 -- # xnvme_filename=(['libaio']='/dev/nvme0n1' ['io_uring']='/dev/nvme0n1' ['io_uring_cmd']='/dev/ng0n1') 00:11:54.517 05:02:14 nvme_xnvme -- xnvme/common.sh@45 -- # declare -A xnvme_filename 00:11:54.518 05:02:14 nvme_xnvme -- xnvme/common.sh@51 -- # xnvme_conserve_cpu=('false' 'true') 00:11:54.518 05:02:14 nvme_xnvme -- xnvme/common.sh@51 -- # declare -a xnvme_conserve_cpu 00:11:54.518 05:02:14 nvme_xnvme -- xnvme/common.sh@57 -- # method_bdev_xnvme_create_0=(['name']='xnvme_bdev' ['filename']='/dev/nvme0n1' ['io_mechanism']='libaio' ['conserve_cpu']='false') 00:11:54.518 05:02:14 nvme_xnvme -- xnvme/common.sh@57 -- # declare -A method_bdev_xnvme_create_0 00:11:54.518 05:02:14 nvme_xnvme -- xnvme/common.sh@89 -- # prep_nvme 00:11:54.518 05:02:14 nvme_xnvme -- xnvme/common.sh@70 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:11:54.779 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:11:54.779 Waiting for block devices as requested 00:11:55.040 0000:00:11.0 (1b36 0010): uio_pci_generic -> nvme 00:11:55.040 0000:00:10.0 (1b36 0010): uio_pci_generic -> nvme 00:11:55.040 0000:00:12.0 (1b36 0010): uio_pci_generic -> nvme 00:11:55.040 0000:00:13.0 (1b36 0010): uio_pci_generic -> nvme 00:12:00.333 * Events for some block/disk devices (0000:00:13.0) were not caught, they may be missing 00:12:00.333 05:02:20 nvme_xnvme -- xnvme/common.sh@73 -- # modprobe -r nvme 00:12:00.594 05:02:20 nvme_xnvme -- xnvme/common.sh@74 -- # nproc 00:12:00.594 05:02:20 nvme_xnvme -- xnvme/common.sh@74 -- # modprobe nvme poll_queues=10 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/common.sh@77 -- # local nvme 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/common.sh@78 -- # for nvme in /dev/nvme*n!(*p*) 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/common.sh@79 -- # block_in_use /dev/nvme0n1 00:12:00.855 05:02:20 nvme_xnvme -- scripts/common.sh@381 -- # local block=/dev/nvme0n1 pt 00:12:00.855 05:02:20 nvme_xnvme -- scripts/common.sh@390 -- # /home/vagrant/spdk_repo/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:12:00.855 No valid GPT data, bailing 00:12:00.855 05:02:20 nvme_xnvme -- scripts/common.sh@394 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:12:00.855 05:02:20 nvme_xnvme -- 
scripts/common.sh@394 -- # pt= 00:12:00.855 05:02:20 nvme_xnvme -- scripts/common.sh@395 -- # return 1 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/common.sh@80 -- # xnvme_filename["libaio"]=/dev/nvme0n1 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/common.sh@81 -- # xnvme_filename["io_uring"]=/dev/nvme0n1 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/common.sh@82 -- # xnvme_filename["io_uring_cmd"]=/dev/ng0n1 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/common.sh@83 -- # return 0 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/xnvme.sh@73 -- # trap 'killprocess "$spdk_tgt"' EXIT 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/xnvme.sh@75 -- # for io in "${xnvme_io[@]}" 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/xnvme.sh@76 -- # method_bdev_xnvme_create_0["io_mechanism"]=libaio 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/xnvme.sh@77 -- # method_bdev_xnvme_create_0["filename"]=/dev/nvme0n1 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/xnvme.sh@79 -- # filename=/dev/nvme0n1 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/xnvme.sh@80 -- # name=xnvme_bdev 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/xnvme.sh@82 -- # for cc in "${xnvme_conserve_cpu[@]}" 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/xnvme.sh@83 -- # method_bdev_xnvme_create_0["conserve_cpu"]=false 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/xnvme.sh@84 -- # conserve_cpu=false 00:12:00.855 05:02:20 nvme_xnvme -- xnvme/xnvme.sh@86 -- # run_test xnvme_rpc xnvme_rpc 00:12:00.855 05:02:20 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:12:00.855 05:02:20 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:12:00.855 05:02:20 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:12:00.855 ************************************ 00:12:00.855 START TEST xnvme_rpc 00:12:00.855 ************************************ 00:12:00.855 05:02:20 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1129 -- # xnvme_rpc 00:12:00.855 05:02:20 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # cc=() 00:12:00.855 05:02:20 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # local -A cc 00:12:00.855 05:02:20 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["false"]= 00:12:00.855 05:02:20 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["true"]=-c 00:12:00.856 05:02:20 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@53 -- # spdk_tgt=82571 00:12:00.856 05:02:20 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@54 -- # waitforlisten 82571 00:12:00.856 05:02:20 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@835 -- # '[' -z 82571 ']' 00:12:00.856 05:02:20 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:00.856 05:02:20 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:12:00.856 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:00.856 05:02:20 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:00.856 05:02:20 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:12:00.856 05:02:20 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:00.856 05:02:20 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@52 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:12:01.117 [2024-12-15 05:02:21.009391] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
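
The xnvme_rpc test starting above is a create/inspect/delete round trip against a freshly started spdk_tgt. A minimal sketch of the same flow driven through scripts/rpc.py; the suite itself goes through its rpc_cmd and waitforlisten helpers, so treating rpc.py as the transport and using a plain sleep in place of waitforlisten are assumptions of this sketch:

spdk=/home/vagrant/spdk_repo/spdk
rpc=$spdk/scripts/rpc.py

# Start the target and give it time to open /var/tmp/spdk.sock.
"$spdk/build/bin/spdk_tgt" &
tgt_pid=$!
sleep 1   # stand-in for the suite's waitforlisten "$tgt_pid"

# Create an xnvme bdev over libaio; conserve_cpu stays off ('' in the trace).
"$rpc" bdev_xnvme_create /dev/nvme0n1 xnvme_bdev libaio

# rpc_xnvme reads the created params back out of the framework config.
"$rpc" framework_get_config bdev |
    jq -r '.[] | select(.method == "bdev_xnvme_create").params.filename'

# Tear down: delete the bdev, then stop the target (killprocess in the suite).
"$rpc" bdev_xnvme_delete xnvme_bdev
kill "$tgt_pid"
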
00:12:01.117 [2024-12-15 05:02:21.009557] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid82571 ] 00:12:01.117 [2024-12-15 05:02:21.172928] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:01.117 [2024-12-15 05:02:21.202296] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:12:02.060 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:12:02.060 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@868 -- # return 0 00:12:02.060 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@56 -- # rpc_cmd bdev_xnvme_create /dev/nvme0n1 xnvme_bdev libaio '' 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:02.061 xnvme_bdev 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # rpc_xnvme name 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.name' 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # [[ xnvme_bdev == \x\n\v\m\e\_\b\d\e\v ]] 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # rpc_xnvme filename 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.filename' 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # [[ /dev/nvme0n1 == \/\d\e\v\/\n\v\m\e\0\n\1 ]] 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # rpc_xnvme io_mechanism 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.io_mechanism' 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # [[ libaio == \l\i\b\a\i\o ]] 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # rpc_xnvme conserve_cpu 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- 
common/autotest_common.sh@563 -- # xtrace_disable 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.conserve_cpu' 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # [[ false == \f\a\l\s\e ]] 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@67 -- # rpc_cmd bdev_xnvme_delete xnvme_bdev 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:02.061 05:02:21 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@70 -- # killprocess 82571 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@954 -- # '[' -z 82571 ']' 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@958 -- # kill -0 82571 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # uname 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 82571 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 82571' 00:12:02.061 killing process with pid 82571 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@973 -- # kill 82571 00:12:02.061 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@978 -- # wait 82571 00:12:02.322 00:12:02.322 real 0m1.400s 00:12:02.322 user 0m1.438s 00:12:02.322 sys 0m0.419s 00:12:02.322 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:12:02.322 05:02:22 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:02.322 ************************************ 00:12:02.322 END TEST xnvme_rpc 00:12:02.322 ************************************ 00:12:02.322 05:02:22 nvme_xnvme -- xnvme/xnvme.sh@87 -- # run_test xnvme_bdevperf xnvme_bdevperf 00:12:02.322 05:02:22 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:12:02.322 05:02:22 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:12:02.322 05:02:22 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:12:02.322 ************************************ 00:12:02.322 START TEST xnvme_bdevperf 00:12:02.322 ************************************ 00:12:02.322 05:02:22 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1129 -- # xnvme_bdevperf 00:12:02.322 05:02:22 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@12 -- # local io_pattern 00:12:02.322 05:02:22 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@13 -- # local -n io_pattern_ref=libaio 00:12:02.322 05:02:22 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:12:02.322 05:02:22 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randread -t 5 -T xnvme_bdev -o 4096 00:12:02.322 05:02:22 nvme_xnvme.xnvme_bdevperf -- 
xnvme/xnvme.sh@17 -- # gen_conf 00:12:02.322 05:02:22 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:12:02.322 05:02:22 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:12:02.322 { 00:12:02.322 "subsystems": [ 00:12:02.322 { 00:12:02.322 "subsystem": "bdev", 00:12:02.322 "config": [ 00:12:02.322 { 00:12:02.322 "params": { 00:12:02.322 "io_mechanism": "libaio", 00:12:02.322 "conserve_cpu": false, 00:12:02.322 "filename": "/dev/nvme0n1", 00:12:02.322 "name": "xnvme_bdev" 00:12:02.322 }, 00:12:02.322 "method": "bdev_xnvme_create" 00:12:02.322 }, 00:12:02.322 { 00:12:02.322 "method": "bdev_wait_for_examine" 00:12:02.322 } 00:12:02.322 ] 00:12:02.322 } 00:12:02.322 ] 00:12:02.322 } 00:12:02.322 [2024-12-15 05:02:22.458502] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:12:02.322 [2024-12-15 05:02:22.458653] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid82629 ] 00:12:02.584 [2024-12-15 05:02:22.621573] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:02.584 [2024-12-15 05:02:22.651572] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:12:02.845 Running I/O for 5 seconds... 00:12:04.733 26227.00 IOPS, 102.45 MiB/s [2024-12-15T05:02:25.826Z] 27058.00 IOPS, 105.70 MiB/s [2024-12-15T05:02:26.796Z] 26338.33 IOPS, 102.88 MiB/s [2024-12-15T05:02:28.182Z] 26089.00 IOPS, 101.91 MiB/s 00:12:08.042 Latency(us) 00:12:08.042 [2024-12-15T05:02:28.182Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:08.042 Job: xnvme_bdev (Core Mask 0x1, workload: randread, depth: 64, IO size: 4096) 00:12:08.042 xnvme_bdev : 5.00 25828.77 100.89 0.00 0.00 2472.73 513.58 8318.03 00:12:08.042 [2024-12-15T05:02:28.182Z] =================================================================================================================== 00:12:08.042 [2024-12-15T05:02:28.182Z] Total : 25828.77 100.89 0.00 0.00 2472.73 513.58 8318.03 00:12:08.042 05:02:27 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:12:08.042 05:02:27 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randwrite -t 5 -T xnvme_bdev -o 4096 00:12:08.042 05:02:27 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 00:12:08.042 05:02:27 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:12:08.042 05:02:27 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:12:08.042 { 00:12:08.042 "subsystems": [ 00:12:08.042 { 00:12:08.042 "subsystem": "bdev", 00:12:08.042 "config": [ 00:12:08.042 { 00:12:08.042 "params": { 00:12:08.042 "io_mechanism": "libaio", 00:12:08.042 "conserve_cpu": false, 00:12:08.042 "filename": "/dev/nvme0n1", 00:12:08.042 "name": "xnvme_bdev" 00:12:08.042 }, 00:12:08.042 "method": "bdev_xnvme_create" 00:12:08.042 }, 00:12:08.042 { 00:12:08.043 "method": "bdev_wait_for_examine" 00:12:08.043 } 00:12:08.043 ] 00:12:08.043 } 00:12:08.043 ] 00:12:08.043 } 00:12:08.043 [2024-12-15 05:02:28.067168] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
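
Each bdevperf pass above hands its bdev configuration over an anonymous file descriptor (--json /dev/fd/62) rather than a file on disk. A sketch of that pattern using process substitution; the JSON is the exact document the trace prints, the randwrite arguments match the second pass, and the gen_conf wrapper here is a stand-in for the suite's helper of the same name:

bdevperf=/home/vagrant/spdk_repo/spdk/build/examples/bdevperf

# The suite's gen_conf emits this JSON; inlined verbatim here.
gen_conf() {
    cat <<'JSON'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "params": {
            "io_mechanism": "libaio",
            "conserve_cpu": false,
            "filename": "/dev/nvme0n1",
            "name": "xnvme_bdev"
          },
          "method": "bdev_xnvme_create"
        },
        { "method": "bdev_wait_for_examine" }
      ]
    }
  ]
}
JSON
}

# Same mechanism as --json /dev/fd/62 in the trace: config over an fd.
"$bdevperf" --json <(gen_conf) -q 64 -w randwrite -t 5 -T xnvme_bdev -o 4096
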
00:12:08.043 [2024-12-15 05:02:28.067314] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid82695 ] 00:12:08.303 [2024-12-15 05:02:28.224938] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:08.303 [2024-12-15 05:02:28.253300] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:12:08.303 Running I/O for 5 seconds... 00:12:10.262 32539.00 IOPS, 127.11 MiB/s [2024-12-15T05:02:31.792Z] 32620.50 IOPS, 127.42 MiB/s [2024-12-15T05:02:32.736Z] 32089.33 IOPS, 125.35 MiB/s [2024-12-15T05:02:33.795Z] 32351.25 IOPS, 126.37 MiB/s 00:12:13.655 Latency(us) 00:12:13.655 [2024-12-15T05:02:33.795Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:13.655 Job: xnvme_bdev (Core Mask 0x1, workload: randwrite, depth: 64, IO size: 4096) 00:12:13.655 xnvme_bdev : 5.00 32091.94 125.36 0.00 0.00 1989.48 460.01 8267.62 00:12:13.655 [2024-12-15T05:02:33.795Z] =================================================================================================================== 00:12:13.655 [2024-12-15T05:02:33.795Z] Total : 32091.94 125.36 0.00 0.00 1989.48 460.01 8267.62 00:12:13.655 00:12:13.655 real 0m11.192s 00:12:13.655 user 0m3.420s 00:12:13.655 sys 0m6.365s 00:12:13.655 05:02:33 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:12:13.655 ************************************ 00:12:13.655 END TEST xnvme_bdevperf 00:12:13.655 ************************************ 00:12:13.655 05:02:33 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:12:13.655 05:02:33 nvme_xnvme -- xnvme/xnvme.sh@88 -- # run_test xnvme_fio_plugin xnvme_fio_plugin 00:12:13.655 05:02:33 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:12:13.655 05:02:33 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:12:13.655 05:02:33 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:12:13.655 ************************************ 00:12:13.655 START TEST xnvme_fio_plugin 00:12:13.655 ************************************ 00:12:13.655 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1129 -- # xnvme_fio_plugin 00:12:13.655 05:02:33 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@27 -- # local io_pattern 00:12:13.655 05:02:33 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@28 -- # local -n io_pattern_ref=libaio_fio 00:12:13.655 05:02:33 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:12:13.655 05:02:33 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:12:13.656 05:02:33 
nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:12:13.656 05:02:33 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:13.656 { 00:12:13.656 "subsystems": [ 00:12:13.656 { 00:12:13.656 "subsystem": "bdev", 00:12:13.656 "config": [ 00:12:13.656 { 00:12:13.656 "params": { 00:12:13.656 "io_mechanism": "libaio", 00:12:13.656 "conserve_cpu": false, 00:12:13.656 "filename": "/dev/nvme0n1", 00:12:13.656 "name": "xnvme_bdev" 00:12:13.656 }, 00:12:13.656 "method": "bdev_xnvme_create" 00:12:13.656 }, 00:12:13.656 { 00:12:13.656 "method": "bdev_wait_for_examine" 00:12:13.656 } 00:12:13.656 ] 00:12:13.656 } 00:12:13.656 ] 00:12:13.656 } 00:12:13.918 xnvme_bdev: (g=0): rw=randread, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:12:13.918 fio-3.35 00:12:13.918 Starting 1 thread 00:12:19.209 00:12:19.209 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=82803: Sun Dec 15 05:02:39 2024 00:12:19.209 read: IOPS=30.8k, BW=120MiB/s (126MB/s)(601MiB/5001msec) 00:12:19.209 slat (usec): min=4, max=1961, avg=23.01, stdev=105.97 00:12:19.209 clat (usec): min=108, max=5173, avg=1458.62, stdev=536.74 00:12:19.209 lat (usec): min=190, max=5245, avg=1481.63, stdev=525.21 00:12:19.209 clat percentiles (usec): 00:12:19.209 | 1.00th=[ 302], 5.00th=[ 611], 10.00th=[ 783], 20.00th=[ 1037], 00:12:19.209 | 30.00th=[ 1205], 40.00th=[ 1336], 50.00th=[ 1450], 60.00th=[ 1565], 00:12:19.209 | 70.00th=[ 1696], 80.00th=[ 1827], 90.00th=[ 2073], 95.00th=[ 2311], 00:12:19.209 | 99.00th=[ 3032], 99.50th=[ 3490], 99.90th=[ 4080], 99.95th=[ 4293], 00:12:19.209 | 99.99th=[ 4752] 00:12:19.209 bw ( KiB/s): min=113832, max=137808, per=100.00%, avg=123267.56, stdev=7179.03, 
samples=9 00:12:19.209 iops : min=28458, max=34452, avg=30816.89, stdev=1794.76, samples=9 00:12:19.209 lat (usec) : 250=0.54%, 500=2.53%, 750=5.95%, 1000=9.35% 00:12:19.209 lat (msec) : 2=69.54%, 4=11.96%, 10=0.13% 00:12:19.209 cpu : usr=41.50%, sys=50.46%, ctx=13, majf=0, minf=773 00:12:19.209 IO depths : 1=0.5%, 2=1.3%, 4=3.1%, 8=8.2%, 16=22.6%, 32=62.3%, >=64=2.1% 00:12:19.209 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:12:19.209 complete : 0=0.0%, 4=98.0%, 8=0.1%, 16=0.1%, 32=0.3%, 64=1.6%, >=64=0.0% 00:12:19.209 issued rwts: total=153857,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:12:19.209 latency : target=0, window=0, percentile=100.00%, depth=64 00:12:19.209 00:12:19.209 Run status group 0 (all jobs): 00:12:19.209 READ: bw=120MiB/s (126MB/s), 120MiB/s-120MiB/s (126MB/s-126MB/s), io=601MiB (630MB), run=5001-5001msec 00:12:19.781 ----------------------------------------------------- 00:12:19.781 Suppressions used: 00:12:19.781 count bytes template 00:12:19.781 1 11 /usr/src/fio/parse.c 00:12:19.781 1 8 libtcmalloc_minimal.so 00:12:19.781 1 904 libcrypto.so 00:12:19.781 ----------------------------------------------------- 00:12:19.781 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- 
common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:12:19.781 05:02:39 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:19.781 { 00:12:19.781 "subsystems": [ 00:12:19.781 { 00:12:19.781 "subsystem": "bdev", 00:12:19.781 "config": [ 00:12:19.781 { 00:12:19.781 "params": { 00:12:19.781 "io_mechanism": "libaio", 00:12:19.781 "conserve_cpu": false, 00:12:19.781 "filename": "/dev/nvme0n1", 00:12:19.781 "name": "xnvme_bdev" 00:12:19.781 }, 00:12:19.781 "method": "bdev_xnvme_create" 00:12:19.781 }, 00:12:19.781 { 00:12:19.781 "method": "bdev_wait_for_examine" 00:12:19.781 } 00:12:19.781 ] 00:12:19.781 } 00:12:19.781 ] 00:12:19.781 } 00:12:19.781 xnvme_bdev: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:12:19.781 fio-3.35 00:12:19.781 Starting 1 thread 00:12:26.376 00:12:26.376 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=82889: Sun Dec 15 05:02:45 2024 00:12:26.376 write: IOPS=33.3k, BW=130MiB/s (136MB/s)(651MiB/5001msec); 0 zone resets 00:12:26.376 slat (usec): min=4, max=1862, avg=22.47, stdev=93.28 00:12:26.376 clat (usec): min=106, max=7922, avg=1309.29, stdev=555.76 00:12:26.376 lat (usec): min=186, max=7927, avg=1331.77, stdev=548.40 00:12:26.376 clat percentiles (usec): 00:12:26.376 | 1.00th=[ 281], 5.00th=[ 519], 10.00th=[ 660], 20.00th=[ 857], 00:12:26.376 | 30.00th=[ 1004], 40.00th=[ 1139], 50.00th=[ 1270], 60.00th=[ 1401], 00:12:26.376 | 70.00th=[ 1532], 80.00th=[ 1696], 90.00th=[ 1958], 95.00th=[ 2245], 00:12:26.376 | 99.00th=[ 3032], 99.50th=[ 3425], 99.90th=[ 4424], 99.95th=[ 4817], 00:12:26.376 | 99.99th=[ 7832] 00:12:26.376 bw ( KiB/s): min=118267, max=146416, per=98.53%, avg=131316.78, stdev=9473.13, samples=9 00:12:26.376 iops : min=29566, max=36604, avg=32829.11, stdev=2368.41, samples=9 00:12:26.376 lat (usec) : 250=0.66%, 500=3.92%, 750=9.60%, 1000=15.61% 00:12:26.376 lat (msec) : 2=61.13%, 4=8.90%, 10=0.18% 00:12:26.376 cpu : usr=39.80%, sys=50.30%, ctx=12, majf=0, minf=774 00:12:26.376 IO depths : 1=0.4%, 2=1.0%, 4=2.7%, 8=7.8%, 16=22.7%, 32=63.3%, >=64=2.1% 00:12:26.376 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:12:26.376 complete : 0=0.0%, 4=97.9%, 8=0.1%, 16=0.1%, 32=0.3%, 64=1.7%, >=64=0.0% 00:12:26.376 issued rwts: total=0,166634,0,0 short=0,0,0,0 dropped=0,0,0,0 00:12:26.376 latency : target=0, window=0, percentile=100.00%, depth=64 00:12:26.376 00:12:26.376 Run status group 0 (all jobs): 00:12:26.376 WRITE: bw=130MiB/s (136MB/s), 130MiB/s-130MiB/s (136MB/s-136MB/s), io=651MiB (683MB), run=5001-5001msec 00:12:26.376 ----------------------------------------------------- 00:12:26.376 Suppressions used: 00:12:26.376 count bytes template 00:12:26.376 1 11 /usr/src/fio/parse.c 00:12:26.376 1 8 libtcmalloc_minimal.so 00:12:26.376 1 904 libcrypto.so 00:12:26.376 ----------------------------------------------------- 00:12:26.376 00:12:26.376 00:12:26.376 real 0m12.114s 00:12:26.376 user 0m5.216s 00:12:26.376 sys 0m5.609s 00:12:26.376 05:02:45 nvme_xnvme.xnvme_fio_plugin -- 
common/autotest_common.sh@1130 -- # xtrace_disable 00:12:26.376 05:02:45 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:12:26.376 ************************************ 00:12:26.376 END TEST xnvme_fio_plugin 00:12:26.376 ************************************ 00:12:26.376 05:02:45 nvme_xnvme -- xnvme/xnvme.sh@82 -- # for cc in "${xnvme_conserve_cpu[@]}" 00:12:26.376 05:02:45 nvme_xnvme -- xnvme/xnvme.sh@83 -- # method_bdev_xnvme_create_0["conserve_cpu"]=true 00:12:26.376 05:02:45 nvme_xnvme -- xnvme/xnvme.sh@84 -- # conserve_cpu=true 00:12:26.376 05:02:45 nvme_xnvme -- xnvme/xnvme.sh@86 -- # run_test xnvme_rpc xnvme_rpc 00:12:26.376 05:02:45 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:12:26.376 05:02:45 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:12:26.376 05:02:45 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:12:26.376 ************************************ 00:12:26.376 START TEST xnvme_rpc 00:12:26.376 ************************************ 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1129 -- # xnvme_rpc 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # cc=() 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # local -A cc 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["false"]= 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["true"]=-c 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@53 -- # spdk_tgt=82965 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@54 -- # waitforlisten 82965 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@835 -- # '[' -z 82965 ']' 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@52 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:12:26.376 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:26.376 05:02:45 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:12:26.377 05:02:45 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:26.377 [2024-12-15 05:02:45.909523] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
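
The second xnvme_rpc pass starting below repeats the first with CPU conservation on: cc["true"] expands to -c, and conserve_cpu is expected to read back as true. The delta, in the same illustrative rpc.py form as the earlier sketch:

rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py

# Create with CPU conservation enabled; -c is what cc["true"] expands to.
"$rpc" bdev_xnvme_create /dev/nvme0n1 xnvme_bdev libaio -c

# Round-trip check: conserve_cpu must come back as true.
cc=$("$rpc" framework_get_config bdev |
     jq -r '.[] | select(.method == "bdev_xnvme_create").params.conserve_cpu')
[[ $cc == true ]] && echo "conserve_cpu round-tripped"
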
00:12:26.377 [2024-12-15 05:02:45.909665] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid82965 ] 00:12:26.377 [2024-12-15 05:02:46.072861] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:26.377 [2024-12-15 05:02:46.102036] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:12:26.638 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:12:26.638 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@868 -- # return 0 00:12:26.638 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@56 -- # rpc_cmd bdev_xnvme_create /dev/nvme0n1 xnvme_bdev libaio -c 00:12:26.638 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:26.638 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:26.899 xnvme_bdev 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # rpc_xnvme name 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.name' 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # [[ xnvme_bdev == \x\n\v\m\e\_\b\d\e\v ]] 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # rpc_xnvme filename 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.filename' 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # [[ /dev/nvme0n1 == \/\d\e\v\/\n\v\m\e\0\n\1 ]] 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # rpc_xnvme io_mechanism 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.io_mechanism' 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # [[ libaio == \l\i\b\a\i\o ]] 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # rpc_xnvme conserve_cpu 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- 
common/autotest_common.sh@563 -- # xtrace_disable 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.conserve_cpu' 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # [[ true == \t\r\u\e ]] 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@67 -- # rpc_cmd bdev_xnvme_delete xnvme_bdev 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@70 -- # killprocess 82965 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@954 -- # '[' -z 82965 ']' 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@958 -- # kill -0 82965 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # uname 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 82965 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:12:26.899 killing process with pid 82965 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 82965' 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@973 -- # kill 82965 00:12:26.899 05:02:46 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@978 -- # wait 82965 00:12:27.159 ************************************ 00:12:27.159 END TEST xnvme_rpc 00:12:27.159 ************************************ 00:12:27.159 00:12:27.159 real 0m1.439s 00:12:27.159 user 0m1.462s 00:12:27.159 sys 0m0.451s 00:12:27.159 05:02:47 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:12:27.159 05:02:47 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:27.420 05:02:47 nvme_xnvme -- xnvme/xnvme.sh@87 -- # run_test xnvme_bdevperf xnvme_bdevperf 00:12:27.420 05:02:47 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:12:27.420 05:02:47 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:12:27.420 05:02:47 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:12:27.420 ************************************ 00:12:27.420 START TEST xnvme_bdevperf 00:12:27.420 ************************************ 00:12:27.420 05:02:47 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1129 -- # xnvme_bdevperf 00:12:27.420 05:02:47 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@12 -- # local io_pattern 00:12:27.420 05:02:47 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@13 -- # local -n io_pattern_ref=libaio 00:12:27.420 05:02:47 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:12:27.420 05:02:47 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randread -t 5 -T xnvme_bdev -o 4096 00:12:27.420 05:02:47 nvme_xnvme.xnvme_bdevperf -- 
xnvme/xnvme.sh@17 -- # gen_conf 00:12:27.420 05:02:47 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:12:27.420 05:02:47 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:12:27.420 { 00:12:27.420 "subsystems": [ 00:12:27.420 { 00:12:27.420 "subsystem": "bdev", 00:12:27.420 "config": [ 00:12:27.420 { 00:12:27.420 "params": { 00:12:27.420 "io_mechanism": "libaio", 00:12:27.420 "conserve_cpu": true, 00:12:27.420 "filename": "/dev/nvme0n1", 00:12:27.420 "name": "xnvme_bdev" 00:12:27.420 }, 00:12:27.420 "method": "bdev_xnvme_create" 00:12:27.420 }, 00:12:27.420 { 00:12:27.420 "method": "bdev_wait_for_examine" 00:12:27.420 } 00:12:27.420 ] 00:12:27.420 } 00:12:27.420 ] 00:12:27.420 } 00:12:27.420 [2024-12-15 05:02:47.406611] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:12:27.420 [2024-12-15 05:02:47.406960] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid83023 ] 00:12:27.681 [2024-12-15 05:02:47.570106] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:27.681 [2024-12-15 05:02:47.599047] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:12:27.681 Running I/O for 5 seconds... 00:12:30.011 27720.00 IOPS, 108.28 MiB/s [2024-12-15T05:02:50.743Z] 29404.00 IOPS, 114.86 MiB/s [2024-12-15T05:02:52.129Z] 29177.00 IOPS, 113.97 MiB/s [2024-12-15T05:02:53.073Z] 29534.25 IOPS, 115.37 MiB/s 00:12:32.933 Latency(us) 00:12:32.933 [2024-12-15T05:02:53.073Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:32.933 Job: xnvme_bdev (Core Mask 0x1, workload: randread, depth: 64, IO size: 4096) 00:12:32.933 xnvme_bdev : 5.00 29672.91 115.91 0.00 0.00 2151.91 288.30 10637.00 00:12:32.933 [2024-12-15T05:02:53.073Z] =================================================================================================================== 00:12:32.933 [2024-12-15T05:02:53.073Z] Total : 29672.91 115.91 0.00 0.00 2151.91 288.30 10637.00 00:12:32.933 05:02:52 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:12:32.933 05:02:52 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randwrite -t 5 -T xnvme_bdev -o 4096 00:12:32.933 05:02:52 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 00:12:32.933 05:02:52 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:12:32.933 05:02:52 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:12:32.933 { 00:12:32.933 "subsystems": [ 00:12:32.933 { 00:12:32.933 "subsystem": "bdev", 00:12:32.933 "config": [ 00:12:32.933 { 00:12:32.933 "params": { 00:12:32.933 "io_mechanism": "libaio", 00:12:32.933 "conserve_cpu": true, 00:12:32.933 "filename": "/dev/nvme0n1", 00:12:32.933 "name": "xnvme_bdev" 00:12:32.933 }, 00:12:32.933 "method": "bdev_xnvme_create" 00:12:32.933 }, 00:12:32.933 { 00:12:32.933 "method": "bdev_wait_for_examine" 00:12:32.933 } 00:12:32.933 ] 00:12:32.933 } 00:12:32.933 ] 00:12:32.933 } 00:12:32.933 [2024-12-15 05:02:52.996649] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
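Each bdevperf run here receives its bdev configuration as JSON on an anonymous file descriptor: gen_conf prints the subsystem config shown in the log, and the test wires it up as --json /dev/fd/62. An equivalent self-contained invocation, as a sketch assuming $SPDK points at the repo root and using the same config as this run:

    # Hand a generated config to bdevperf without a temp file (bash process substitution)
    "$SPDK/build/examples/bdevperf" --json <(cat <<'EOF'
    {"subsystems": [{"subsystem": "bdev", "config": [
      {"method": "bdev_xnvme_create",
       "params": {"io_mechanism": "libaio", "conserve_cpu": true,
                  "filename": "/dev/nvme0n1", "name": "xnvme_bdev"}},
      {"method": "bdev_wait_for_examine"}]}]}
    EOF
    ) -q 64 -w randwrite -t 5 -T xnvme_bdev -o 4096   # queue depth 64, 4 KiB random writes, 5 s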
00:12:32.933 [2024-12-15 05:02:52.996794] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid83087 ] 00:12:33.194 [2024-12-15 05:02:53.157349] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:33.195 [2024-12-15 05:02:53.187311] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:12:33.195 Running I/O for 5 seconds... 00:12:35.531 34338.00 IOPS, 134.13 MiB/s [2024-12-15T05:02:56.674Z] 34612.00 IOPS, 135.20 MiB/s [2024-12-15T05:02:57.618Z] 34741.33 IOPS, 135.71 MiB/s [2024-12-15T05:02:58.562Z] 35073.00 IOPS, 137.00 MiB/s 00:12:38.422 Latency(us) 00:12:38.422 [2024-12-15T05:02:58.562Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:38.422 Job: xnvme_bdev (Core Mask 0x1, workload: randwrite, depth: 64, IO size: 4096) 00:12:38.422 xnvme_bdev : 5.00 34912.53 136.38 0.00 0.00 1828.65 456.86 5520.15 00:12:38.422 [2024-12-15T05:02:58.562Z] =================================================================================================================== 00:12:38.422 [2024-12-15T05:02:58.562Z] Total : 34912.53 136.38 0.00 0.00 1828.65 456.86 5520.15 00:12:38.422 ************************************ 00:12:38.422 END TEST xnvme_bdevperf 00:12:38.422 ************************************ 00:12:38.422 00:12:38.422 real 0m11.173s 00:12:38.422 user 0m3.267s 00:12:38.422 sys 0m6.315s 00:12:38.422 05:02:58 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:12:38.422 05:02:58 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:12:38.684 05:02:58 nvme_xnvme -- xnvme/xnvme.sh@88 -- # run_test xnvme_fio_plugin xnvme_fio_plugin 00:12:38.684 05:02:58 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:12:38.684 05:02:58 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:12:38.684 05:02:58 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:12:38.684 ************************************ 00:12:38.684 START TEST xnvme_fio_plugin 00:12:38.684 ************************************ 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1129 -- # xnvme_fio_plugin 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@27 -- # local io_pattern 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@28 -- # local -n io_pattern_ref=libaio_fio 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:12:38.684 05:02:58 
nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:12:38.684 05:02:58 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:38.684 { 00:12:38.684 "subsystems": [ 00:12:38.684 { 00:12:38.684 "subsystem": "bdev", 00:12:38.684 "config": [ 00:12:38.684 { 00:12:38.684 "params": { 00:12:38.684 "io_mechanism": "libaio", 00:12:38.684 "conserve_cpu": true, 00:12:38.684 "filename": "/dev/nvme0n1", 00:12:38.684 "name": "xnvme_bdev" 00:12:38.684 }, 00:12:38.684 "method": "bdev_xnvme_create" 00:12:38.684 }, 00:12:38.684 { 00:12:38.684 "method": "bdev_wait_for_examine" 00:12:38.684 } 00:12:38.684 ] 00:12:38.684 } 00:12:38.684 ] 00:12:38.684 } 00:12:38.684 xnvme_bdev: (g=0): rw=randread, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:12:38.684 fio-3.35 00:12:38.684 Starting 1 thread 00:12:45.271 00:12:45.271 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=83195: Sun Dec 15 05:03:04 2024 00:12:45.271 read: IOPS=32.0k, BW=125MiB/s (131MB/s)(625MiB/5001msec) 00:12:45.271 slat (usec): min=4, max=1993, avg=23.97, stdev=100.05 00:12:45.271 clat (usec): min=105, max=4870, avg=1352.51, stdev=548.28 00:12:45.271 lat (usec): min=182, max=5073, avg=1376.48, stdev=539.00 00:12:45.271 clat percentiles (usec): 00:12:45.271 | 1.00th=[ 273], 5.00th=[ 529], 10.00th=[ 693], 20.00th=[ 914], 00:12:45.271 | 30.00th=[ 1074], 40.00th=[ 1205], 50.00th=[ 1319], 60.00th=[ 1450], 00:12:45.271 | 70.00th=[ 1582], 80.00th=[ 1745], 90.00th=[ 2008], 95.00th=[ 2278], 00:12:45.271 | 99.00th=[ 3064], 99.50th=[ 3425], 99.90th=[ 4080], 99.95th=[ 4293], 00:12:45.271 | 99.99th=[ 4621] 00:12:45.271 bw ( KiB/s): min=119392, max=138592, per=100.00%, avg=129496.00, stdev=7011.98, 
samples=9 00:12:45.271 iops : min=29848, max=34648, avg=32374.00, stdev=1753.00, samples=9 00:12:45.271 lat (usec) : 250=0.75%, 500=3.65%, 750=7.78%, 1000=13.39% 00:12:45.271 lat (msec) : 2=64.18%, 4=10.13%, 10=0.13% 00:12:45.271 cpu : usr=35.58%, sys=55.86%, ctx=10, majf=0, minf=773 00:12:45.271 IO depths : 1=0.4%, 2=1.1%, 4=2.8%, 8=8.1%, 16=23.5%, 32=62.0%, >=64=2.1% 00:12:45.271 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:12:45.271 complete : 0=0.0%, 4=98.0%, 8=0.1%, 16=0.1%, 32=0.3%, 64=1.7%, >=64=0.0% 00:12:45.271 issued rwts: total=160012,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:12:45.271 latency : target=0, window=0, percentile=100.00%, depth=64 00:12:45.271 00:12:45.271 Run status group 0 (all jobs): 00:12:45.271 READ: bw=125MiB/s (131MB/s), 125MiB/s-125MiB/s (131MB/s-131MB/s), io=625MiB (655MB), run=5001-5001msec 00:12:45.271 ----------------------------------------------------- 00:12:45.271 Suppressions used: 00:12:45.271 count bytes template 00:12:45.271 1 11 /usr/src/fio/parse.c 00:12:45.271 1 8 libtcmalloc_minimal.so 00:12:45.271 1 904 libcrypto.so 00:12:45.271 ----------------------------------------------------- 00:12:45.271 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:12:45.271 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:12:45.272 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:12:45.272 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:12:45.272 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:12:45.272 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:12:45.272 05:03:04 nvme_xnvme.xnvme_fio_plugin -- 
common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:12:45.272 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:12:45.272 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:12:45.272 05:03:04 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:12:45.272 { 00:12:45.272 "subsystems": [ 00:12:45.272 { 00:12:45.272 "subsystem": "bdev", 00:12:45.272 "config": [ 00:12:45.272 { 00:12:45.272 "params": { 00:12:45.272 "io_mechanism": "libaio", 00:12:45.272 "conserve_cpu": true, 00:12:45.272 "filename": "/dev/nvme0n1", 00:12:45.272 "name": "xnvme_bdev" 00:12:45.272 }, 00:12:45.272 "method": "bdev_xnvme_create" 00:12:45.272 }, 00:12:45.272 { 00:12:45.272 "method": "bdev_wait_for_examine" 00:12:45.272 } 00:12:45.272 ] 00:12:45.272 } 00:12:45.272 ] 00:12:45.272 } 00:12:45.272 xnvme_bdev: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:12:45.272 fio-3.35 00:12:45.272 Starting 1 thread 00:12:50.560 00:12:50.560 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=83281: Sun Dec 15 05:03:10 2024 00:12:50.560 write: IOPS=34.1k, BW=133MiB/s (140MB/s)(666MiB/5001msec); 0 zone resets 00:12:50.560 slat (usec): min=4, max=2099, avg=21.99, stdev=89.91 00:12:50.560 clat (usec): min=106, max=5838, avg=1280.73, stdev=537.92 00:12:50.560 lat (usec): min=198, max=5851, avg=1302.72, stdev=530.62 00:12:50.560 clat percentiles (usec): 00:12:50.560 | 1.00th=[ 269], 5.00th=[ 465], 10.00th=[ 627], 20.00th=[ 824], 00:12:50.560 | 30.00th=[ 979], 40.00th=[ 1123], 50.00th=[ 1254], 60.00th=[ 1385], 00:12:50.560 | 70.00th=[ 1516], 80.00th=[ 1680], 90.00th=[ 1942], 95.00th=[ 2180], 00:12:50.560 | 99.00th=[ 2835], 99.50th=[ 3228], 99.90th=[ 3851], 99.95th=[ 4113], 00:12:50.560 | 99.99th=[ 5538] 00:12:50.560 bw ( KiB/s): min=126416, max=145528, per=99.49%, avg=135778.67, stdev=6595.94, samples=9 00:12:50.560 iops : min=31604, max=36382, avg=33944.67, stdev=1649.24, samples=9 00:12:50.560 lat (usec) : 250=0.76%, 500=5.15%, 750=9.98%, 1000=15.33% 00:12:50.560 lat (msec) : 2=60.42%, 4=8.31%, 10=0.06% 00:12:50.560 cpu : usr=38.80%, sys=51.98%, ctx=10, majf=0, minf=774 00:12:50.561 IO depths : 1=0.4%, 2=1.1%, 4=2.9%, 8=8.4%, 16=23.5%, 32=61.6%, >=64=2.1% 00:12:50.561 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:12:50.561 complete : 0=0.0%, 4=98.0%, 8=0.1%, 16=0.1%, 32=0.2%, 64=1.7%, >=64=0.0% 00:12:50.561 issued rwts: total=0,170620,0,0 short=0,0,0,0 dropped=0,0,0,0 00:12:50.561 latency : target=0, window=0, percentile=100.00%, depth=64 00:12:50.561 00:12:50.561 Run status group 0 (all jobs): 00:12:50.561 WRITE: bw=133MiB/s (140MB/s), 133MiB/s-133MiB/s (140MB/s-140MB/s), io=666MiB (699MB), run=5001-5001msec 00:12:50.561 ----------------------------------------------------- 00:12:50.561 Suppressions used: 00:12:50.561 count bytes template 00:12:50.561 1 11 /usr/src/fio/parse.c 00:12:50.561 1 8 libtcmalloc_minimal.so 00:12:50.561 1 904 libcrypto.so 00:12:50.561 ----------------------------------------------------- 00:12:50.561 00:12:50.820 ************************************ 00:12:50.820 END TEST xnvme_fio_plugin 00:12:50.821 ************************************ 00:12:50.821 
00:12:50.821 real 0m12.138s 00:12:50.821 user 0m4.865s 00:12:50.821 sys 0m6.004s 00:12:50.821 05:03:10 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1130 -- # xtrace_disable 00:12:50.821 05:03:10 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:12:50.821 05:03:10 nvme_xnvme -- xnvme/xnvme.sh@75 -- # for io in "${xnvme_io[@]}" 00:12:50.821 05:03:10 nvme_xnvme -- xnvme/xnvme.sh@76 -- # method_bdev_xnvme_create_0["io_mechanism"]=io_uring 00:12:50.821 05:03:10 nvme_xnvme -- xnvme/xnvme.sh@77 -- # method_bdev_xnvme_create_0["filename"]=/dev/nvme0n1 00:12:50.821 05:03:10 nvme_xnvme -- xnvme/xnvme.sh@79 -- # filename=/dev/nvme0n1 00:12:50.821 05:03:10 nvme_xnvme -- xnvme/xnvme.sh@80 -- # name=xnvme_bdev 00:12:50.821 05:03:10 nvme_xnvme -- xnvme/xnvme.sh@82 -- # for cc in "${xnvme_conserve_cpu[@]}" 00:12:50.821 05:03:10 nvme_xnvme -- xnvme/xnvme.sh@83 -- # method_bdev_xnvme_create_0["conserve_cpu"]=false 00:12:50.821 05:03:10 nvme_xnvme -- xnvme/xnvme.sh@84 -- # conserve_cpu=false 00:12:50.821 05:03:10 nvme_xnvme -- xnvme/xnvme.sh@86 -- # run_test xnvme_rpc xnvme_rpc 00:12:50.821 05:03:10 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:12:50.821 05:03:10 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:12:50.821 05:03:10 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:12:50.821 ************************************ 00:12:50.821 START TEST xnvme_rpc 00:12:50.821 ************************************ 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1129 -- # xnvme_rpc 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # cc=() 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # local -A cc 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["false"]= 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["true"]=-c 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@53 -- # spdk_tgt=83364 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@54 -- # waitforlisten 83364 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@835 -- # '[' -z 83364 ']' 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:12:50.821 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:50.821 05:03:10 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@52 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:12:50.821 [2024-12-15 05:03:10.884586] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
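The xnvme_fio_plugin test that finished just above runs stock fio against SPDK's external spdk_bdev ioengine, and on this ASAN-instrumented build libasan has to be preloaded ahead of the plugin (ASan insists on being loaded first). The helper discovers the right libasan by walking the plugin's dynamic dependencies; condensed from the xtrace above:

    # fio + spdk_bdev plugin under ASAN (paths from this run; JSON config arrives on fd 62)
    plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev
    asan_lib=$(ldd "$plugin" | grep libasan | awk '{print $3}')   # -> /usr/lib64/libasan.so.8 here
    LD_PRELOAD="$asan_lib $plugin" /usr/src/fio/fio \
        --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev \
        --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread \
        --time_based --runtime=5 --thread=1 --name xnvme_bdev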
00:12:50.821 [2024-12-15 05:03:10.884742] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid83364 ] 00:12:51.081 [2024-12-15 05:03:11.047069] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:51.081 [2024-12-15 05:03:11.076913] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@868 -- # return 0 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@56 -- # rpc_cmd bdev_xnvme_create /dev/nvme0n1 xnvme_bdev io_uring '' 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:51.652 xnvme_bdev 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # rpc_xnvme name 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:51.652 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.name' 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # [[ xnvme_bdev == \x\n\v\m\e\_\b\d\e\v ]] 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # rpc_xnvme filename 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.filename' 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # [[ /dev/nvme0n1 == \/\d\e\v\/\n\v\m\e\0\n\1 ]] 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # rpc_xnvme io_mechanism 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.io_mechanism' 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # [[ io_uring == \i\o\_\u\r\i\n\g ]] 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # rpc_xnvme conserve_cpu 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- 
common/autotest_common.sh@563 -- # xtrace_disable 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.conserve_cpu' 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # [[ false == \f\a\l\s\e ]] 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@67 -- # rpc_cmd bdev_xnvme_delete xnvme_bdev 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@70 -- # killprocess 83364 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@954 -- # '[' -z 83364 ']' 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@958 -- # kill -0 83364 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # uname 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 83364 00:12:51.913 killing process with pid 83364 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 83364' 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@973 -- # kill 83364 00:12:51.913 05:03:11 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@978 -- # wait 83364 00:12:52.174 00:12:52.174 real 0m1.467s 00:12:52.174 user 0m1.540s 00:12:52.174 sys 0m0.429s 00:12:52.174 05:03:12 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:12:52.174 05:03:12 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:12:52.174 ************************************ 00:12:52.174 END TEST xnvme_rpc 00:12:52.174 ************************************ 00:12:52.435 05:03:12 nvme_xnvme -- xnvme/xnvme.sh@87 -- # run_test xnvme_bdevperf xnvme_bdevperf 00:12:52.435 05:03:12 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:12:52.435 05:03:12 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:12:52.435 05:03:12 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:12:52.435 ************************************ 00:12:52.436 START TEST xnvme_bdevperf 00:12:52.436 ************************************ 00:12:52.436 05:03:12 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1129 -- # xnvme_bdevperf 00:12:52.436 05:03:12 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@12 -- # local io_pattern 00:12:52.436 05:03:12 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@13 -- # local -n io_pattern_ref=io_uring 00:12:52.436 05:03:12 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:12:52.436 05:03:12 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randread -t 5 -T xnvme_bdev -o 4096 00:12:52.436 05:03:12 nvme_xnvme.xnvme_bdevperf -- 
xnvme/xnvme.sh@17 -- # gen_conf 00:12:52.436 05:03:12 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:12:52.436 05:03:12 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:12:52.436 { 00:12:52.436 "subsystems": [ 00:12:52.436 { 00:12:52.436 "subsystem": "bdev", 00:12:52.436 "config": [ 00:12:52.436 { 00:12:52.436 "params": { 00:12:52.436 "io_mechanism": "io_uring", 00:12:52.436 "conserve_cpu": false, 00:12:52.436 "filename": "/dev/nvme0n1", 00:12:52.436 "name": "xnvme_bdev" 00:12:52.436 }, 00:12:52.436 "method": "bdev_xnvme_create" 00:12:52.436 }, 00:12:52.436 { 00:12:52.436 "method": "bdev_wait_for_examine" 00:12:52.436 } 00:12:52.436 ] 00:12:52.436 } 00:12:52.436 ] 00:12:52.436 } 00:12:52.436 [2024-12-15 05:03:12.409451] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:12:52.436 [2024-12-15 05:03:12.409594] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid83416 ] 00:12:52.436 [2024-12-15 05:03:12.571320] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:52.697 [2024-12-15 05:03:12.602380] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:12:52.697 Running I/O for 5 seconds... 00:12:54.584 32501.00 IOPS, 126.96 MiB/s [2024-12-15T05:03:16.110Z] 32540.00 IOPS, 127.11 MiB/s [2024-12-15T05:03:17.054Z] 32739.67 IOPS, 127.89 MiB/s [2024-12-15T05:03:17.996Z] 32805.75 IOPS, 128.15 MiB/s [2024-12-15T05:03:17.996Z] 32807.00 IOPS, 128.15 MiB/s 00:12:57.856 Latency(us) 00:12:57.856 [2024-12-15T05:03:17.996Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:57.856 Job: xnvme_bdev (Core Mask 0x1, workload: randread, depth: 64, IO size: 4096) 00:12:57.856 xnvme_bdev : 5.00 32796.15 128.11 0.00 0.00 1947.61 371.79 10384.94 00:12:57.856 [2024-12-15T05:03:17.996Z] =================================================================================================================== 00:12:57.856 [2024-12-15T05:03:17.996Z] Total : 32796.15 128.11 0.00 0.00 1947.61 371.79 10384.94 00:12:57.856 05:03:17 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:12:57.856 05:03:17 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randwrite -t 5 -T xnvme_bdev -o 4096 00:12:57.856 05:03:17 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 00:12:57.856 05:03:17 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:12:57.856 05:03:17 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:12:57.856 { 00:12:57.856 "subsystems": [ 00:12:57.856 { 00:12:57.856 "subsystem": "bdev", 00:12:57.856 "config": [ 00:12:57.856 { 00:12:57.856 "params": { 00:12:57.856 "io_mechanism": "io_uring", 00:12:57.856 "conserve_cpu": false, 00:12:57.856 "filename": "/dev/nvme0n1", 00:12:57.856 "name": "xnvme_bdev" 00:12:57.856 }, 00:12:57.856 "method": "bdev_xnvme_create" 00:12:57.856 }, 00:12:57.856 { 00:12:57.856 "method": "bdev_wait_for_examine" 00:12:57.856 } 00:12:57.856 ] 00:12:57.856 } 00:12:57.856 ] 00:12:57.856 } 00:12:57.856 [2024-12-15 05:03:17.964703] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
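A quick cross-check of the randread result reported just above: at a 4 KiB block size, bandwidth is simply IOPS times 4096 bytes, so the reported 32796.15 IOPS and 128.11 MiB/s agree:

    # IOPS -> MiB/s for 4 KiB I/O (bc truncates; bdevperf rounds to 128.11)
    echo 'scale=2; 32796.15 * 4096 / (1024 * 1024)' | bc
    # 128.10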
00:12:57.856 [2024-12-15 05:03:17.964853] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid83487 ] 00:12:58.117 [2024-12-15 05:03:18.126295] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:58.117 [2024-12-15 05:03:18.154539] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:12:58.117 Running I/O for 5 seconds... 00:13:00.448 34078.00 IOPS, 133.12 MiB/s [2024-12-15T05:03:21.531Z] 33841.50 IOPS, 132.19 MiB/s [2024-12-15T05:03:22.476Z] 34073.00 IOPS, 133.10 MiB/s [2024-12-15T05:03:23.420Z] 33908.00 IOPS, 132.45 MiB/s [2024-12-15T05:03:23.420Z] 33892.00 IOPS, 132.39 MiB/s 00:13:03.280 Latency(us) 00:13:03.280 [2024-12-15T05:03:23.420Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:03.280 Job: xnvme_bdev (Core Mask 0x1, workload: randwrite, depth: 64, IO size: 4096) 00:13:03.280 xnvme_bdev : 5.00 33872.16 132.31 0.00 0.00 1885.41 378.09 7309.78 00:13:03.280 [2024-12-15T05:03:23.420Z] =================================================================================================================== 00:13:03.280 [2024-12-15T05:03:23.420Z] Total : 33872.16 132.31 0.00 0.00 1885.41 378.09 7309.78 00:13:03.571 00:13:03.571 real 0m11.103s 00:13:03.571 user 0m4.297s 00:13:03.571 sys 0m6.541s 00:13:03.571 05:03:23 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:13:03.571 ************************************ 00:13:03.571 END TEST xnvme_bdevperf 00:13:03.571 ************************************ 00:13:03.571 05:03:23 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:13:03.571 05:03:23 nvme_xnvme -- xnvme/xnvme.sh@88 -- # run_test xnvme_fio_plugin xnvme_fio_plugin 00:13:03.571 05:03:23 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:13:03.571 05:03:23 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:13:03.571 05:03:23 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:13:03.571 ************************************ 00:13:03.571 START TEST xnvme_fio_plugin 00:13:03.571 ************************************ 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1129 -- # xnvme_fio_plugin 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@27 -- # local io_pattern 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@28 -- # local -n io_pattern_ref=io_uring_fio 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # 
sanitizers=('libasan' 'libclang_rt.asan') 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:13:03.571 05:03:23 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:03.571 { 00:13:03.571 "subsystems": [ 00:13:03.571 { 00:13:03.571 "subsystem": "bdev", 00:13:03.571 "config": [ 00:13:03.571 { 00:13:03.571 "params": { 00:13:03.571 "io_mechanism": "io_uring", 00:13:03.571 "conserve_cpu": false, 00:13:03.571 "filename": "/dev/nvme0n1", 00:13:03.571 "name": "xnvme_bdev" 00:13:03.571 }, 00:13:03.571 "method": "bdev_xnvme_create" 00:13:03.571 }, 00:13:03.571 { 00:13:03.571 "method": "bdev_wait_for_examine" 00:13:03.571 } 00:13:03.571 ] 00:13:03.571 } 00:13:03.571 ] 00:13:03.571 } 00:13:03.855 xnvme_bdev: (g=0): rw=randread, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:13:03.855 fio-3.35 00:13:03.855 Starting 1 thread 00:13:09.152 00:13:09.152 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=83595: Sun Dec 15 05:03:29 2024 00:13:09.152 read: IOPS=32.3k, BW=126MiB/s (132MB/s)(631MiB/5002msec) 00:13:09.152 slat (usec): min=2, max=103, avg= 3.55, stdev= 1.92 00:13:09.152 clat (usec): min=1088, max=5064, avg=1836.35, stdev=279.46 00:13:09.152 lat (usec): min=1091, max=5079, avg=1839.90, stdev=279.74 00:13:09.152 clat percentiles (usec): 00:13:09.152 | 1.00th=[ 1319], 5.00th=[ 1450], 10.00th=[ 1516], 20.00th=[ 1598], 00:13:09.152 | 30.00th=[ 1680], 40.00th=[ 1745], 50.00th=[ 1811], 60.00th=[ 1876], 00:13:09.152 | 70.00th=[ 1958], 80.00th=[ 2057], 90.00th=[ 2212], 95.00th=[ 2343], 00:13:09.152 | 99.00th=[ 2573], 99.50th=[ 2671], 99.90th=[ 3130], 99.95th=[ 3458], 00:13:09.152 | 99.99th=[ 5014] 00:13:09.152 bw ( 
KiB/s): min=126976, max=132096, per=100.00%, avg=129194.67, stdev=1949.64, samples=9 00:13:09.152 iops : min=31744, max=33024, avg=32298.67, stdev=487.41, samples=9 00:13:09.152 lat (msec) : 2=75.09%, 4=24.87%, 10=0.04% 00:13:09.152 cpu : usr=30.35%, sys=68.43%, ctx=10, majf=0, minf=771 00:13:09.152 IO depths : 1=1.6%, 2=3.1%, 4=6.2%, 8=12.5%, 16=25.0%, 32=50.0%, >=64=1.6% 00:13:09.152 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:13:09.152 complete : 0=0.0%, 4=98.5%, 8=0.0%, 16=0.0%, 32=0.0%, 64=1.5%, >=64=0.0% 00:13:09.152 issued rwts: total=161536,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:13:09.152 latency : target=0, window=0, percentile=100.00%, depth=64 00:13:09.152 00:13:09.152 Run status group 0 (all jobs): 00:13:09.152 READ: bw=126MiB/s (132MB/s), 126MiB/s-126MiB/s (132MB/s-132MB/s), io=631MiB (662MB), run=5002-5002msec 00:13:09.413 ----------------------------------------------------- 00:13:09.413 Suppressions used: 00:13:09.413 count bytes template 00:13:09.413 1 11 /usr/src/fio/parse.c 00:13:09.413 1 8 libtcmalloc_minimal.so 00:13:09.413 1 904 libcrypto.so 00:13:09.413 ----------------------------------------------------- 00:13:09.413 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:13:09.413 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:13:09.674 05:03:29 nvme_xnvme.xnvme_fio_plugin -- 
common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:13:09.674 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:13:09.674 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:13:09.674 05:03:29 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:09.674 { 00:13:09.674 "subsystems": [ 00:13:09.674 { 00:13:09.674 "subsystem": "bdev", 00:13:09.674 "config": [ 00:13:09.674 { 00:13:09.674 "params": { 00:13:09.674 "io_mechanism": "io_uring", 00:13:09.674 "conserve_cpu": false, 00:13:09.674 "filename": "/dev/nvme0n1", 00:13:09.674 "name": "xnvme_bdev" 00:13:09.674 }, 00:13:09.674 "method": "bdev_xnvme_create" 00:13:09.674 }, 00:13:09.674 { 00:13:09.674 "method": "bdev_wait_for_examine" 00:13:09.674 } 00:13:09.674 ] 00:13:09.674 } 00:13:09.674 ] 00:13:09.674 } 00:13:09.674 xnvme_bdev: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:13:09.674 fio-3.35 00:13:09.674 Starting 1 thread 00:13:16.258 00:13:16.258 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=83670: Sun Dec 15 05:03:35 2024 00:13:16.258 write: IOPS=32.8k, BW=128MiB/s (135MB/s)(642MiB/5001msec); 0 zone resets 00:13:16.258 slat (nsec): min=2896, max=70001, avg=3715.15, stdev=1875.43 00:13:16.258 clat (usec): min=316, max=8300, avg=1797.25, stdev=279.42 00:13:16.258 lat (usec): min=324, max=8303, avg=1800.96, stdev=279.70 00:13:16.258 clat percentiles (usec): 00:13:16.258 | 1.00th=[ 1303], 5.00th=[ 1418], 10.00th=[ 1483], 20.00th=[ 1582], 00:13:16.258 | 30.00th=[ 1647], 40.00th=[ 1696], 50.00th=[ 1762], 60.00th=[ 1827], 00:13:16.258 | 70.00th=[ 1909], 80.00th=[ 2008], 90.00th=[ 2147], 95.00th=[ 2278], 00:13:16.258 | 99.00th=[ 2573], 99.50th=[ 2704], 99.90th=[ 3163], 99.95th=[ 3556], 00:13:16.258 | 99.99th=[ 6456] 00:13:16.258 bw ( KiB/s): min=128040, max=135504, per=100.00%, avg=131739.56, stdev=2652.07, samples=9 00:13:16.258 iops : min=32010, max=33876, avg=32934.89, stdev=663.02, samples=9 00:13:16.258 lat (usec) : 500=0.02%, 750=0.02%, 1000=0.02% 00:13:16.258 lat (msec) : 2=79.93%, 4=19.99%, 10=0.02% 00:13:16.258 cpu : usr=30.82%, sys=67.86%, ctx=13, majf=0, minf=772 00:13:16.258 IO depths : 1=1.5%, 2=3.1%, 4=6.2%, 8=12.5%, 16=25.0%, 32=50.1%, >=64=1.6% 00:13:16.258 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:13:16.258 complete : 0=0.0%, 4=98.5%, 8=0.1%, 16=0.1%, 32=0.1%, 64=1.5%, >=64=0.0% 00:13:16.258 issued rwts: total=0,164263,0,0 short=0,0,0,0 dropped=0,0,0,0 00:13:16.258 latency : target=0, window=0, percentile=100.00%, depth=64 00:13:16.258 00:13:16.258 Run status group 0 (all jobs): 00:13:16.258 WRITE: bw=128MiB/s (135MB/s), 128MiB/s-128MiB/s (135MB/s-135MB/s), io=642MiB (673MB), run=5001-5001msec 00:13:16.258 ----------------------------------------------------- 00:13:16.258 Suppressions used: 00:13:16.258 count bytes template 00:13:16.258 1 11 /usr/src/fio/parse.c 00:13:16.258 1 8 libtcmalloc_minimal.so 00:13:16.258 1 904 libcrypto.so 00:13:16.258 ----------------------------------------------------- 00:13:16.258 00:13:16.258 00:13:16.258 real 0m12.009s 00:13:16.258 user 0m4.247s 00:13:16.258 sys 0m7.329s 00:13:16.258 05:03:35 nvme_xnvme.xnvme_fio_plugin -- 
common/autotest_common.sh@1130 -- # xtrace_disable 00:13:16.258 ************************************ 00:13:16.258 END TEST xnvme_fio_plugin 00:13:16.258 ************************************ 00:13:16.258 05:03:35 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:13:16.258 05:03:35 nvme_xnvme -- xnvme/xnvme.sh@82 -- # for cc in "${xnvme_conserve_cpu[@]}" 00:13:16.258 05:03:35 nvme_xnvme -- xnvme/xnvme.sh@83 -- # method_bdev_xnvme_create_0["conserve_cpu"]=true 00:13:16.258 05:03:35 nvme_xnvme -- xnvme/xnvme.sh@84 -- # conserve_cpu=true 00:13:16.258 05:03:35 nvme_xnvme -- xnvme/xnvme.sh@86 -- # run_test xnvme_rpc xnvme_rpc 00:13:16.258 05:03:35 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:13:16.258 05:03:35 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:13:16.258 05:03:35 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:13:16.258 ************************************ 00:13:16.258 START TEST xnvme_rpc 00:13:16.258 ************************************ 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1129 -- # xnvme_rpc 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # cc=() 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # local -A cc 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["false"]= 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["true"]=-c 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@53 -- # spdk_tgt=83751 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@54 -- # waitforlisten 83751 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@835 -- # '[' -z 83751 ']' 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:13:16.259 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@52 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:13:16.259 05:03:35 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:16.259 [2024-12-15 05:03:35.679077] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
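Every target shutdown in this file funnels through the killprocess helper, which checks before it kills: the pid must still exist, and on Linux the helper looks up the process name (reactor_0 here for an SPDK app) and special-cases sudo. The sequence in the xtrace a few entries up reduces to this sketch:

    # killprocess, reduced to its guard checks (from common/autotest_common.sh)
    kill -0 "$pid"                                   # still running?
    name=$(ps --no-headers -o comm= "$pid")          # reactor_0 for an SPDK app
    [ "$name" != sudo ] && echo "killing process with pid $pid"
    kill "$pid"
    wait "$pid"                                      # reap it so the RPC socket is free for the next test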
00:13:16.259 [2024-12-15 05:03:35.679235] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid83751 ] 00:13:16.259 [2024-12-15 05:03:35.835094] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:16.259 [2024-12-15 05:03:35.864092] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@868 -- # return 0 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@56 -- # rpc_cmd bdev_xnvme_create /dev/nvme0n1 xnvme_bdev io_uring -c 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:16.520 xnvme_bdev 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # rpc_xnvme name 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.name' 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # [[ xnvme_bdev == \x\n\v\m\e\_\b\d\e\v ]] 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # rpc_xnvme filename 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.filename' 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # [[ /dev/nvme0n1 == \/\d\e\v\/\n\v\m\e\0\n\1 ]] 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # rpc_xnvme io_mechanism 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.io_mechanism' 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:16.520 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # [[ io_uring == \i\o\_\u\r\i\n\g ]] 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # rpc_xnvme conserve_cpu 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 
-- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.conserve_cpu' 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # [[ true == \t\r\u\e ]] 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@67 -- # rpc_cmd bdev_xnvme_delete xnvme_bdev 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@70 -- # killprocess 83751 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@954 -- # '[' -z 83751 ']' 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@958 -- # kill -0 83751 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # uname 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 83751 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:13:16.782 killing process with pid 83751 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 83751' 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@973 -- # kill 83751 00:13:16.782 05:03:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@978 -- # wait 83751 00:13:17.044 00:13:17.044 real 0m1.438s 00:13:17.044 user 0m1.510s 00:13:17.044 sys 0m0.419s 00:13:17.044 05:03:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:13:17.044 ************************************ 00:13:17.044 END TEST xnvme_rpc 00:13:17.044 ************************************ 00:13:17.044 05:03:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:17.044 05:03:37 nvme_xnvme -- xnvme/xnvme.sh@87 -- # run_test xnvme_bdevperf xnvme_bdevperf 00:13:17.044 05:03:37 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:13:17.044 05:03:37 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:13:17.044 05:03:37 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:13:17.044 ************************************ 00:13:17.044 START TEST xnvme_bdevperf 00:13:17.044 ************************************ 00:13:17.044 05:03:37 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1129 -- # xnvme_bdevperf 00:13:17.044 05:03:37 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@12 -- # local io_pattern 00:13:17.044 05:03:37 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@13 -- # local -n io_pattern_ref=io_uring 00:13:17.044 05:03:37 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:13:17.044 05:03:37 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randread -t 5 -T xnvme_bdev -o 4096 00:13:17.044 05:03:37 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 
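This final bdevperf pass repeats the io_uring runs with conserve_cpu flipped to true. The knob maps to the -c flag of bdev_xnvme_create and, per its name, asks xNVMe to spend less CPU on the completion path (less aggressive polling), potentially at some cost in I/O performance; the two io_uring configurations benchmarked in this file differ only in that flag:

    # The two io_uring variants exercised in this file (sketch)
    scripts/rpc.py bdev_xnvme_create /dev/nvme0n1 xnvme_bdev io_uring       # conserve_cpu=false
    scripts/rpc.py bdev_xnvme_create /dev/nvme0n1 xnvme_bdev io_uring -c    # conserve_cpu=true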
00:13:17.044 05:03:37 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:13:17.044 05:03:37 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:13:17.044 { 00:13:17.044 "subsystems": [ 00:13:17.044 { 00:13:17.044 "subsystem": "bdev", 00:13:17.044 "config": [ 00:13:17.044 { 00:13:17.044 "params": { 00:13:17.044 "io_mechanism": "io_uring", 00:13:17.044 "conserve_cpu": true, 00:13:17.044 "filename": "/dev/nvme0n1", 00:13:17.044 "name": "xnvme_bdev" 00:13:17.044 }, 00:13:17.044 "method": "bdev_xnvme_create" 00:13:17.044 }, 00:13:17.044 { 00:13:17.044 "method": "bdev_wait_for_examine" 00:13:17.044 } 00:13:17.044 ] 00:13:17.044 } 00:13:17.044 ] 00:13:17.044 } 00:13:17.044 [2024-12-15 05:03:37.165939] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:13:17.044 [2024-12-15 05:03:37.166079] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid83808 ] 00:13:17.305 [2024-12-15 05:03:37.329699] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:17.305 [2024-12-15 05:03:37.358313] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:13:17.568 Running I/O for 5 seconds... 00:13:19.456 31825.00 IOPS, 124.32 MiB/s [2024-12-15T05:03:40.539Z] 32041.50 IOPS, 125.16 MiB/s [2024-12-15T05:03:41.484Z] 33068.67 IOPS, 129.17 MiB/s [2024-12-15T05:03:42.876Z] 33114.00 IOPS, 129.35 MiB/s 00:13:22.736 Latency(us) 00:13:22.736 [2024-12-15T05:03:42.876Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:22.736 Job: xnvme_bdev (Core Mask 0x1, workload: randread, depth: 64, IO size: 4096) 00:13:22.736 xnvme_bdev : 5.00 33164.20 129.55 0.00 0.00 1925.80 938.93 12703.90 00:13:22.736 [2024-12-15T05:03:42.876Z] =================================================================================================================== 00:13:22.736 [2024-12-15T05:03:42.876Z] Total : 33164.20 129.55 0.00 0.00 1925.80 938.93 12703.90 00:13:22.736 05:03:42 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:13:22.736 05:03:42 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randwrite -t 5 -T xnvme_bdev -o 4096 00:13:22.736 05:03:42 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 00:13:22.736 05:03:42 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:13:22.736 05:03:42 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:13:22.736 { 00:13:22.736 "subsystems": [ 00:13:22.736 { 00:13:22.736 "subsystem": "bdev", 00:13:22.736 "config": [ 00:13:22.736 { 00:13:22.736 "params": { 00:13:22.736 "io_mechanism": "io_uring", 00:13:22.736 "conserve_cpu": true, 00:13:22.736 "filename": "/dev/nvme0n1", 00:13:22.736 "name": "xnvme_bdev" 00:13:22.736 }, 00:13:22.736 "method": "bdev_xnvme_create" 00:13:22.736 }, 00:13:22.736 { 00:13:22.736 "method": "bdev_wait_for_examine" 00:13:22.736 } 00:13:22.736 ] 00:13:22.736 } 00:13:22.736 ] 00:13:22.736 } 00:13:22.736 [2024-12-15 05:03:42.717588] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:13:22.736 [2024-12-15 05:03:42.717727] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid83878 ] 00:13:22.736 [2024-12-15 05:03:42.873089] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:22.997 [2024-12-15 05:03:42.901491] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:13:22.997 Running I/O for 5 seconds... 00:13:24.886 33244.00 IOPS, 129.86 MiB/s [2024-12-15T05:03:46.416Z] 33541.50 IOPS, 131.02 MiB/s [2024-12-15T05:03:47.362Z] 33662.33 IOPS, 131.49 MiB/s [2024-12-15T05:03:48.307Z] 33882.75 IOPS, 132.35 MiB/s [2024-12-15T05:03:48.307Z] 33944.80 IOPS, 132.60 MiB/s 00:13:28.167 Latency(us) 00:13:28.167 [2024-12-15T05:03:48.307Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:28.167 Job: xnvme_bdev (Core Mask 0x1, workload: randwrite, depth: 64, IO size: 4096) 00:13:28.167 xnvme_bdev : 5.00 33932.24 132.55 0.00 0.00 1881.87 699.47 9527.93 00:13:28.167 [2024-12-15T05:03:48.307Z] =================================================================================================================== 00:13:28.167 [2024-12-15T05:03:48.307Z] Total : 33932.24 132.55 0.00 0.00 1881.87 699.47 9527.93 00:13:28.167 00:13:28.167 real 0m11.095s 00:13:28.167 user 0m6.472s 00:13:28.167 sys 0m4.075s 00:13:28.167 ************************************ 00:13:28.167 05:03:48 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:13:28.167 05:03:48 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:13:28.167 END TEST xnvme_bdevperf 00:13:28.167 ************************************ 00:13:28.167 05:03:48 nvme_xnvme -- xnvme/xnvme.sh@88 -- # run_test xnvme_fio_plugin xnvme_fio_plugin 00:13:28.167 05:03:48 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:13:28.167 05:03:48 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:13:28.167 05:03:48 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:13:28.167 ************************************ 00:13:28.167 START TEST xnvme_fio_plugin 00:13:28.167 ************************************ 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1129 -- # xnvme_fio_plugin 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@27 -- # local io_pattern 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@28 -- # local -n io_pattern_ref=io_uring_fio 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # 
sanitizers=('libasan' 'libclang_rt.asan') 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:13:28.167 05:03:48 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:28.167 { 00:13:28.167 "subsystems": [ 00:13:28.167 { 00:13:28.167 "subsystem": "bdev", 00:13:28.167 "config": [ 00:13:28.167 { 00:13:28.167 "params": { 00:13:28.167 "io_mechanism": "io_uring", 00:13:28.167 "conserve_cpu": true, 00:13:28.167 "filename": "/dev/nvme0n1", 00:13:28.167 "name": "xnvme_bdev" 00:13:28.167 }, 00:13:28.167 "method": "bdev_xnvme_create" 00:13:28.167 }, 00:13:28.167 { 00:13:28.167 "method": "bdev_wait_for_examine" 00:13:28.167 } 00:13:28.167 ] 00:13:28.167 } 00:13:28.167 ] 00:13:28.167 } 00:13:28.429 xnvme_bdev: (g=0): rw=randread, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:13:28.429 fio-3.35 00:13:28.429 Starting 1 thread 00:13:35.024 00:13:35.024 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=83981: Sun Dec 15 05:03:53 2024 00:13:35.024 read: IOPS=35.1k, BW=137MiB/s (144MB/s)(685MiB/5002msec) 00:13:35.024 slat (usec): min=2, max=146, avg= 3.99, stdev= 2.24 00:13:35.024 clat (usec): min=950, max=3774, avg=1662.59, stdev=263.80 00:13:35.024 lat (usec): min=953, max=3778, avg=1666.58, stdev=264.41 00:13:35.024 clat percentiles (usec): 00:13:35.024 | 1.00th=[ 1123], 5.00th=[ 1270], 10.00th=[ 1369], 20.00th=[ 1450], 00:13:35.024 | 30.00th=[ 1516], 40.00th=[ 1582], 50.00th=[ 1631], 60.00th=[ 1696], 00:13:35.024 | 70.00th=[ 1762], 80.00th=[ 1860], 90.00th=[ 2008], 95.00th=[ 2147], 00:13:35.024 | 99.00th=[ 2442], 99.50th=[ 2540], 99.90th=[ 2835], 99.95th=[ 3032], 00:13:35.024 | 99.99th=[ 3228] 00:13:35.024 bw ( KiB/s): 
min=131584, max=158208, per=100.00%, avg=140629.33, stdev=8001.79, samples=9 00:13:35.024 iops : min=32896, max=39552, avg=35157.33, stdev=2000.45, samples=9 00:13:35.024 lat (usec) : 1000=0.05% 00:13:35.024 lat (msec) : 2=89.83%, 4=10.12% 00:13:35.024 cpu : usr=43.29%, sys=52.37%, ctx=14, majf=0, minf=771 00:13:35.024 IO depths : 1=1.6%, 2=3.1%, 4=6.2%, 8=12.5%, 16=25.0%, 32=50.0%, >=64=1.6% 00:13:35.024 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:13:35.024 complete : 0=0.0%, 4=98.5%, 8=0.0%, 16=0.0%, 32=0.0%, 64=1.5%, >=64=0.0% 00:13:35.024 issued rwts: total=175442,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:13:35.024 latency : target=0, window=0, percentile=100.00%, depth=64 00:13:35.024 00:13:35.024 Run status group 0 (all jobs): 00:13:35.024 READ: bw=137MiB/s (144MB/s), 137MiB/s-137MiB/s (144MB/s-144MB/s), io=685MiB (719MB), run=5002-5002msec 00:13:35.024 ----------------------------------------------------- 00:13:35.024 Suppressions used: 00:13:35.024 count bytes template 00:13:35.024 1 11 /usr/src/fio/parse.c 00:13:35.024 1 8 libtcmalloc_minimal.so 00:13:35.024 1 904 libcrypto.so 00:13:35.024 ----------------------------------------------------- 00:13:35.024 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- 
common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:13:35.024 05:03:54 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:13:35.024 { 00:13:35.024 "subsystems": [ 00:13:35.024 { 00:13:35.024 "subsystem": "bdev", 00:13:35.024 "config": [ 00:13:35.024 { 00:13:35.024 "params": { 00:13:35.024 "io_mechanism": "io_uring", 00:13:35.024 "conserve_cpu": true, 00:13:35.024 "filename": "/dev/nvme0n1", 00:13:35.024 "name": "xnvme_bdev" 00:13:35.024 }, 00:13:35.024 "method": "bdev_xnvme_create" 00:13:35.024 }, 00:13:35.024 { 00:13:35.024 "method": "bdev_wait_for_examine" 00:13:35.024 } 00:13:35.024 ] 00:13:35.024 } 00:13:35.024 ] 00:13:35.024 } 00:13:35.024 xnvme_bdev: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:13:35.024 fio-3.35 00:13:35.024 Starting 1 thread 00:13:40.394 00:13:40.394 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=84067: Sun Dec 15 05:03:59 2024 00:13:40.394 write: IOPS=34.8k, BW=136MiB/s (143MB/s)(680MiB/5002msec); 0 zone resets 00:13:40.394 slat (nsec): min=2903, max=69091, avg=4178.17, stdev=2287.83 00:13:40.394 clat (usec): min=491, max=7459, avg=1666.20, stdev=254.90 00:13:40.394 lat (usec): min=495, max=7462, avg=1670.38, stdev=255.36 00:13:40.394 clat percentiles (usec): 00:13:40.394 | 1.00th=[ 1221], 5.00th=[ 1319], 10.00th=[ 1385], 20.00th=[ 1467], 00:13:40.394 | 30.00th=[ 1516], 40.00th=[ 1582], 50.00th=[ 1631], 60.00th=[ 1696], 00:13:40.394 | 70.00th=[ 1762], 80.00th=[ 1844], 90.00th=[ 1991], 95.00th=[ 2114], 00:13:40.394 | 99.00th=[ 2409], 99.50th=[ 2573], 99.90th=[ 2999], 99.95th=[ 3195], 00:13:40.394 | 99.99th=[ 4146] 00:13:40.394 bw ( KiB/s): min=132544, max=143360, per=99.91%, avg=139120.89, stdev=3611.07, samples=9 00:13:40.394 iops : min=33136, max=35840, avg=34780.22, stdev=902.77, samples=9 00:13:40.394 lat (usec) : 500=0.01%, 750=0.01%, 1000=0.01% 00:13:40.394 lat (msec) : 2=90.45%, 4=9.52%, 10=0.01% 00:13:40.394 cpu : usr=41.21%, sys=54.15%, ctx=12, majf=0, minf=772 00:13:40.394 IO depths : 1=1.5%, 2=3.1%, 4=6.2%, 8=12.5%, 16=25.0%, 32=50.1%, >=64=1.6% 00:13:40.394 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:13:40.394 complete : 0=0.0%, 4=98.5%, 8=0.1%, 16=0.0%, 32=0.1%, 64=1.5%, >=64=0.0% 00:13:40.394 issued rwts: total=0,174127,0,0 short=0,0,0,0 dropped=0,0,0,0 00:13:40.394 latency : target=0, window=0, percentile=100.00%, depth=64 00:13:40.394 00:13:40.394 Run status group 0 (all jobs): 00:13:40.394 WRITE: bw=136MiB/s (143MB/s), 136MiB/s-136MiB/s (143MB/s-143MB/s), io=680MiB (713MB), run=5002-5002msec 00:13:40.394 ----------------------------------------------------- 00:13:40.394 Suppressions used: 00:13:40.394 count bytes template 00:13:40.394 1 11 /usr/src/fio/parse.c 00:13:40.394 1 8 libtcmalloc_minimal.so 00:13:40.394 1 904 libcrypto.so 00:13:40.394 ----------------------------------------------------- 00:13:40.394 00:13:40.394 00:13:40.394 real 0m12.050s 00:13:40.394 user 0m5.392s 00:13:40.394 sys 0m5.900s 00:13:40.394 05:04:00 nvme_xnvme.xnvme_fio_plugin -- 
common/autotest_common.sh@1130 -- # xtrace_disable 00:13:40.394 ************************************ 00:13:40.394 END TEST xnvme_fio_plugin 00:13:40.394 ************************************ 00:13:40.394 05:04:00 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:13:40.394 05:04:00 nvme_xnvme -- xnvme/xnvme.sh@75 -- # for io in "${xnvme_io[@]}" 00:13:40.394 05:04:00 nvme_xnvme -- xnvme/xnvme.sh@76 -- # method_bdev_xnvme_create_0["io_mechanism"]=io_uring_cmd 00:13:40.394 05:04:00 nvme_xnvme -- xnvme/xnvme.sh@77 -- # method_bdev_xnvme_create_0["filename"]=/dev/ng0n1 00:13:40.394 05:04:00 nvme_xnvme -- xnvme/xnvme.sh@79 -- # filename=/dev/ng0n1 00:13:40.394 05:04:00 nvme_xnvme -- xnvme/xnvme.sh@80 -- # name=xnvme_bdev 00:13:40.394 05:04:00 nvme_xnvme -- xnvme/xnvme.sh@82 -- # for cc in "${xnvme_conserve_cpu[@]}" 00:13:40.394 05:04:00 nvme_xnvme -- xnvme/xnvme.sh@83 -- # method_bdev_xnvme_create_0["conserve_cpu"]=false 00:13:40.394 05:04:00 nvme_xnvme -- xnvme/xnvme.sh@84 -- # conserve_cpu=false 00:13:40.394 05:04:00 nvme_xnvme -- xnvme/xnvme.sh@86 -- # run_test xnvme_rpc xnvme_rpc 00:13:40.394 05:04:00 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:13:40.394 05:04:00 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:13:40.394 05:04:00 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:13:40.394 ************************************ 00:13:40.394 START TEST xnvme_rpc 00:13:40.394 ************************************ 00:13:40.394 05:04:00 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1129 -- # xnvme_rpc 00:13:40.394 05:04:00 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # cc=() 00:13:40.394 05:04:00 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # local -A cc 00:13:40.394 05:04:00 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["false"]= 00:13:40.394 05:04:00 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["true"]=-c 00:13:40.394 05:04:00 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@53 -- # spdk_tgt=84142 00:13:40.394 05:04:00 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@54 -- # waitforlisten 84142 00:13:40.394 05:04:00 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@835 -- # '[' -z 84142 ']' 00:13:40.394 05:04:00 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:13:40.394 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:13:40.394 05:04:00 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:13:40.395 05:04:00 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:13:40.395 05:04:00 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:13:40.395 05:04:00 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@52 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:13:40.395 05:04:00 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:40.395 [2024-12-15 05:04:00.467149] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:13:40.395 [2024-12-15 05:04:00.467850] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid84142 ] 00:13:40.656 [2024-12-15 05:04:00.626874] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:40.656 [2024-12-15 05:04:00.655274] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@868 -- # return 0 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@56 -- # rpc_cmd bdev_xnvme_create /dev/ng0n1 xnvme_bdev io_uring_cmd '' 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:41.231 xnvme_bdev 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # rpc_xnvme name 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.name' 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # [[ xnvme_bdev == \x\n\v\m\e\_\b\d\e\v ]] 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # rpc_xnvme filename 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.filename' 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:41.231 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # [[ /dev/ng0n1 == \/\d\e\v\/\n\g\0\n\1 ]] 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # rpc_xnvme io_mechanism 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.io_mechanism' 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # [[ io_uring_cmd == \i\o\_\u\r\i\n\g\_\c\m\d ]] 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # rpc_xnvme conserve_cpu 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- 
common/autotest_common.sh@563 -- # xtrace_disable 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.conserve_cpu' 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # [[ false == \f\a\l\s\e ]] 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@67 -- # rpc_cmd bdev_xnvme_delete xnvme_bdev 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@70 -- # killprocess 84142 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@954 -- # '[' -z 84142 ']' 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@958 -- # kill -0 84142 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # uname 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 84142 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:13:41.493 killing process with pid 84142 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 84142' 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@973 -- # kill 84142 00:13:41.493 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@978 -- # wait 84142 00:13:41.755 00:13:41.755 real 0m1.449s 00:13:41.755 user 0m1.537s 00:13:41.755 sys 0m0.400s 00:13:41.755 ************************************ 00:13:41.755 END TEST xnvme_rpc 00:13:41.755 ************************************ 00:13:41.755 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:13:41.755 05:04:01 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:13:41.755 05:04:01 nvme_xnvme -- xnvme/xnvme.sh@87 -- # run_test xnvme_bdevperf xnvme_bdevperf 00:13:41.755 05:04:01 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:13:41.755 05:04:01 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:13:41.755 05:04:01 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:13:42.018 ************************************ 00:13:42.018 START TEST xnvme_bdevperf 00:13:42.018 ************************************ 00:13:42.018 05:04:01 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1129 -- # xnvme_bdevperf 00:13:42.018 05:04:01 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@12 -- # local io_pattern 00:13:42.018 05:04:01 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@13 -- # local -n io_pattern_ref=io_uring_cmd 00:13:42.018 05:04:01 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:13:42.018 05:04:01 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randread -t 5 -T xnvme_bdev -o 4096 00:13:42.018 05:04:01 nvme_xnvme.xnvme_bdevperf -- 
xnvme/xnvme.sh@17 -- # gen_conf 00:13:42.018 05:04:01 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:13:42.018 05:04:01 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:13:42.018 { 00:13:42.018 "subsystems": [ 00:13:42.018 { 00:13:42.018 "subsystem": "bdev", 00:13:42.018 "config": [ 00:13:42.018 { 00:13:42.018 "params": { 00:13:42.018 "io_mechanism": "io_uring_cmd", 00:13:42.018 "conserve_cpu": false, 00:13:42.018 "filename": "/dev/ng0n1", 00:13:42.018 "name": "xnvme_bdev" 00:13:42.018 }, 00:13:42.018 "method": "bdev_xnvme_create" 00:13:42.018 }, 00:13:42.018 { 00:13:42.018 "method": "bdev_wait_for_examine" 00:13:42.018 } 00:13:42.018 ] 00:13:42.018 } 00:13:42.018 ] 00:13:42.018 } 00:13:42.018 [2024-12-15 05:04:01.975144] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:13:42.018 [2024-12-15 05:04:01.975312] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid84200 ] 00:13:42.018 [2024-12-15 05:04:02.141831] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:42.280 [2024-12-15 05:04:02.170190] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:13:42.280 Running I/O for 5 seconds... 00:13:44.301 34176.00 IOPS, 133.50 MiB/s [2024-12-15T05:04:05.385Z] 34945.00 IOPS, 136.50 MiB/s [2024-12-15T05:04:06.329Z] 34987.00 IOPS, 136.67 MiB/s [2024-12-15T05:04:07.715Z] 34874.00 IOPS, 136.23 MiB/s [2024-12-15T05:04:07.715Z] 34848.40 IOPS, 136.13 MiB/s 00:13:47.575 Latency(us) 00:13:47.575 [2024-12-15T05:04:07.715Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:47.575 Job: xnvme_bdev (Core Mask 0x1, workload: randread, depth: 64, IO size: 4096) 00:13:47.575 xnvme_bdev : 5.01 34824.40 136.03 0.00 0.00 1833.27 466.31 6402.36 00:13:47.575 [2024-12-15T05:04:07.715Z] =================================================================================================================== 00:13:47.575 [2024-12-15T05:04:07.715Z] Total : 34824.40 136.03 0.00 0.00 1833.27 466.31 6402.36 00:13:47.575 05:04:07 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:13:47.575 05:04:07 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 00:13:47.575 05:04:07 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randwrite -t 5 -T xnvme_bdev -o 4096 00:13:47.575 05:04:07 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:13:47.575 05:04:07 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:13:47.575 { 00:13:47.575 "subsystems": [ 00:13:47.575 { 00:13:47.575 "subsystem": "bdev", 00:13:47.575 "config": [ 00:13:47.575 { 00:13:47.575 "params": { 00:13:47.575 "io_mechanism": "io_uring_cmd", 00:13:47.575 "conserve_cpu": false, 00:13:47.575 "filename": "/dev/ng0n1", 00:13:47.575 "name": "xnvme_bdev" 00:13:47.575 }, 00:13:47.575 "method": "bdev_xnvme_create" 00:13:47.575 }, 00:13:47.576 { 00:13:47.576 "method": "bdev_wait_for_examine" 00:13:47.576 } 00:13:47.576 ] 00:13:47.576 } 00:13:47.576 ] 00:13:47.576 } 00:13:47.576 [2024-12-15 05:04:07.525621] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:13:47.576 [2024-12-15 05:04:07.525760] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid84272 ] 00:13:47.576 [2024-12-15 05:04:07.685933] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:47.836 [2024-12-15 05:04:07.714933] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:13:47.837 Running I/O for 5 seconds... 00:13:49.720 36059.00 IOPS, 140.86 MiB/s [2024-12-15T05:04:11.245Z] 36329.00 IOPS, 141.91 MiB/s [2024-12-15T05:04:11.820Z] 36152.33 IOPS, 141.22 MiB/s [2024-12-15T05:04:13.205Z] 36208.25 IOPS, 141.44 MiB/s [2024-12-15T05:04:13.205Z] 36290.00 IOPS, 141.76 MiB/s 00:13:53.065 Latency(us) 00:13:53.065 [2024-12-15T05:04:13.205Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:53.065 Job: xnvme_bdev (Core Mask 0x1, workload: randwrite, depth: 64, IO size: 4096) 00:13:53.065 xnvme_bdev : 5.01 36240.37 141.56 0.00 0.00 1761.23 352.89 5747.00 00:13:53.065 [2024-12-15T05:04:13.205Z] =================================================================================================================== 00:13:53.065 [2024-12-15T05:04:13.205Z] Total : 36240.37 141.56 0.00 0.00 1761.23 352.89 5747.00 00:13:53.065 05:04:12 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:13:53.065 05:04:12 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w unmap -t 5 -T xnvme_bdev -o 4096 00:13:53.065 05:04:12 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 00:13:53.065 05:04:12 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:13:53.065 05:04:12 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:13:53.065 { 00:13:53.065 "subsystems": [ 00:13:53.065 { 00:13:53.065 "subsystem": "bdev", 00:13:53.065 "config": [ 00:13:53.065 { 00:13:53.065 "params": { 00:13:53.065 "io_mechanism": "io_uring_cmd", 00:13:53.065 "conserve_cpu": false, 00:13:53.065 "filename": "/dev/ng0n1", 00:13:53.065 "name": "xnvme_bdev" 00:13:53.065 }, 00:13:53.065 "method": "bdev_xnvme_create" 00:13:53.065 }, 00:13:53.065 { 00:13:53.065 "method": "bdev_wait_for_examine" 00:13:53.065 } 00:13:53.065 ] 00:13:53.065 } 00:13:53.065 ] 00:13:53.065 } 00:13:53.065 [2024-12-15 05:04:13.066454] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:13:53.065 [2024-12-15 05:04:13.066609] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid84335 ] 00:13:53.327 [2024-12-15 05:04:13.226095] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:53.327 [2024-12-15 05:04:13.254862] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:13:53.327 Running I/O for 5 seconds... 
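The unmap pass launched above reads its bdev config from /dev/fd/62, generated on the fly by gen_conf. The same run can be reproduced with an ordinary file; /tmp/xnvme.json is a hypothetical scratch path, and the JSON body mirrors the config printed earlier in this trace:

# Write out the config gen_conf would emit for this mechanism/device pair.
cat > /tmp/xnvme.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "params": {
            "io_mechanism": "io_uring_cmd",
            "conserve_cpu": false,
            "filename": "/dev/ng0n1",
            "name": "xnvme_bdev"
          },
          "method": "bdev_xnvme_create"
        },
        { "method": "bdev_wait_for_examine" }
      ]
    }
  ]
}
EOF
# Same flags as the traced run: queue depth 64, 4 KiB unmap I/O for 5 seconds.
./build/examples/bdevperf --json /tmp/xnvme.json -q 64 -w unmap -t 5 -T xnvme_bdev -o 4096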
00:13:55.655 74752.00 IOPS, 292.00 MiB/s [2024-12-15T05:04:16.368Z] 77056.00 IOPS, 301.00 MiB/s [2024-12-15T05:04:17.753Z] 77909.33 IOPS, 304.33 MiB/s [2024-12-15T05:04:18.697Z] 78288.00 IOPS, 305.81 MiB/s [2024-12-15T05:04:18.697Z] 79795.20 IOPS, 311.70 MiB/s 00:13:58.557 Latency(us) 00:13:58.557 [2024-12-15T05:04:18.697Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:13:58.557 Job: xnvme_bdev (Core Mask 0x1, workload: unmap, depth: 64, IO size: 4096) 00:13:58.557 xnvme_bdev : 5.00 79765.02 311.58 0.00 0.00 798.94 469.46 2470.20 00:13:58.557 [2024-12-15T05:04:18.697Z] =================================================================================================================== 00:13:58.557 [2024-12-15T05:04:18.697Z] Total : 79765.02 311.58 0.00 0.00 798.94 469.46 2470.20 00:13:58.557 05:04:18 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:13:58.557 05:04:18 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w write_zeroes -t 5 -T xnvme_bdev -o 4096 00:13:58.557 05:04:18 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 00:13:58.557 05:04:18 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:13:58.557 05:04:18 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:13:58.557 { 00:13:58.557 "subsystems": [ 00:13:58.557 { 00:13:58.557 "subsystem": "bdev", 00:13:58.557 "config": [ 00:13:58.557 { 00:13:58.557 "params": { 00:13:58.557 "io_mechanism": "io_uring_cmd", 00:13:58.557 "conserve_cpu": false, 00:13:58.557 "filename": "/dev/ng0n1", 00:13:58.557 "name": "xnvme_bdev" 00:13:58.557 }, 00:13:58.557 "method": "bdev_xnvme_create" 00:13:58.557 }, 00:13:58.557 { 00:13:58.557 "method": "bdev_wait_for_examine" 00:13:58.557 } 00:13:58.557 ] 00:13:58.557 } 00:13:58.557 ] 00:13:58.557 } 00:13:58.557 [2024-12-15 05:04:18.612787] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:13:58.557 [2024-12-15 05:04:18.612934] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid84406 ] 00:13:58.819 [2024-12-15 05:04:18.774657] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:58.819 [2024-12-15 05:04:18.802909] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:13:58.819 Running I/O for 5 seconds... 
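The write_zeroes pass starting here is the last of the four bdevperf workloads exercised for the io_uring_cmd mechanism (the io_uring passes earlier ran only randread and randwrite). The sweep amounts to one 5-second bdevperf pass per workload; a sketch, reusing the hypothetical config file from the previous example:

for w in randread randwrite unmap write_zeroes; do
    ./build/examples/bdevperf --json /tmp/xnvme.json -q 64 -w "$w" -t 5 -T xnvme_bdev -o 4096
done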
00:14:00.827 47838.00 IOPS, 186.87 MiB/s [2024-12-15T05:04:22.351Z] 46687.50 IOPS, 182.37 MiB/s [2024-12-15T05:04:22.920Z] 45662.67 IOPS, 178.37 MiB/s [2024-12-15T05:04:24.303Z] 45900.75 IOPS, 179.30 MiB/s [2024-12-15T05:04:24.303Z] 44094.20 IOPS, 172.24 MiB/s 00:14:04.163 Latency(us) 00:14:04.163 [2024-12-15T05:04:24.303Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:04.163 Job: xnvme_bdev (Core Mask 0x1, workload: write_zeroes, depth: 64, IO size: 4096) 00:14:04.163 xnvme_bdev : 5.00 44077.27 172.18 0.00 0.00 1448.19 160.69 22483.89 00:14:04.163 [2024-12-15T05:04:24.303Z] =================================================================================================================== 00:14:04.163 [2024-12-15T05:04:24.303Z] Total : 44077.27 172.18 0.00 0.00 1448.19 160.69 22483.89 00:14:04.163 00:14:04.163 real 0m22.190s 00:14:04.163 user 0m9.708s 00:14:04.163 sys 0m11.965s 00:14:04.163 05:04:24 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:14:04.163 ************************************ 00:14:04.163 05:04:24 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:14:04.163 END TEST xnvme_bdevperf 00:14:04.163 ************************************ 00:14:04.163 05:04:24 nvme_xnvme -- xnvme/xnvme.sh@88 -- # run_test xnvme_fio_plugin xnvme_fio_plugin 00:14:04.163 05:04:24 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:14:04.163 05:04:24 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:14:04.163 05:04:24 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:04.163 ************************************ 00:14:04.163 START TEST xnvme_fio_plugin 00:14:04.163 ************************************ 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1129 -- # xnvme_fio_plugin 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@27 -- # local io_pattern 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@28 -- # local -n io_pattern_ref=io_uring_cmd_fio 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # 
local asan_lib= 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:14:04.163 05:04:24 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:04.163 { 00:14:04.163 "subsystems": [ 00:14:04.163 { 00:14:04.163 "subsystem": "bdev", 00:14:04.163 "config": [ 00:14:04.163 { 00:14:04.163 "params": { 00:14:04.163 "io_mechanism": "io_uring_cmd", 00:14:04.163 "conserve_cpu": false, 00:14:04.163 "filename": "/dev/ng0n1", 00:14:04.163 "name": "xnvme_bdev" 00:14:04.163 }, 00:14:04.163 "method": "bdev_xnvme_create" 00:14:04.163 }, 00:14:04.163 { 00:14:04.163 "method": "bdev_wait_for_examine" 00:14:04.163 } 00:14:04.163 ] 00:14:04.163 } 00:14:04.163 ] 00:14:04.163 } 00:14:04.424 xnvme_bdev: (g=0): rw=randread, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:14:04.424 fio-3.35 00:14:04.424 Starting 1 thread 00:14:09.716 00:14:09.717 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=84508: Sun Dec 15 05:04:29 2024 00:14:09.717 read: IOPS=41.0k, BW=160MiB/s (168MB/s)(801MiB/5001msec) 00:14:09.717 slat (usec): min=2, max=212, avg= 3.51, stdev= 1.67 00:14:09.717 clat (usec): min=859, max=3411, avg=1423.04, stdev=255.31 00:14:09.717 lat (usec): min=862, max=3446, avg=1426.55, stdev=255.76 00:14:09.717 clat percentiles (usec): 00:14:09.717 | 1.00th=[ 1020], 5.00th=[ 1090], 10.00th=[ 1139], 20.00th=[ 1205], 00:14:09.717 | 30.00th=[ 1254], 40.00th=[ 1319], 50.00th=[ 1385], 60.00th=[ 1450], 00:14:09.717 | 70.00th=[ 1532], 80.00th=[ 1614], 90.00th=[ 1762], 95.00th=[ 1893], 00:14:09.717 | 99.00th=[ 2212], 99.50th=[ 2343], 99.90th=[ 2573], 99.95th=[ 2704], 00:14:09.717 | 99.99th=[ 3195] 00:14:09.717 bw ( KiB/s): min=144896, max=183441, per=100.00%, avg=166233.11, stdev=13934.92, samples=9 00:14:09.717 iops : min=36224, max=45860, avg=41558.22, stdev=3483.72, samples=9 00:14:09.717 lat (usec) : 1000=0.67% 00:14:09.717 lat (msec) : 2=96.60%, 4=2.73% 00:14:09.717 cpu : usr=37.16%, sys=61.72%, ctx=17, majf=0, minf=771 00:14:09.717 IO depths : 1=1.6%, 2=3.1%, 4=6.2%, 8=12.5%, 16=25.0%, 32=50.0%, >=64=1.6% 00:14:09.717 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:14:09.717 complete : 0=0.0%, 4=98.5%, 8=0.0%, 16=0.0%, 32=0.0%, 64=1.5%, >=64=0.0% 
00:14:09.717 issued rwts: total=204992,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:14:09.717 latency : target=0, window=0, percentile=100.00%, depth=64 00:14:09.717 00:14:09.717 Run status group 0 (all jobs): 00:14:09.717 READ: bw=160MiB/s (168MB/s), 160MiB/s-160MiB/s (168MB/s-168MB/s), io=801MiB (840MB), run=5001-5001msec 00:14:10.289 ----------------------------------------------------- 00:14:10.289 Suppressions used: 00:14:10.289 count bytes template 00:14:10.289 1 11 /usr/src/fio/parse.c 00:14:10.289 1 8 libtcmalloc_minimal.so 00:14:10.289 1 904 libcrypto.so 00:14:10.289 ----------------------------------------------------- 00:14:10.289 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:14:10.289 05:04:30 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 
--rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:10.289 { 00:14:10.289 "subsystems": [ 00:14:10.289 { 00:14:10.289 "subsystem": "bdev", 00:14:10.289 "config": [ 00:14:10.289 { 00:14:10.289 "params": { 00:14:10.289 "io_mechanism": "io_uring_cmd", 00:14:10.289 "conserve_cpu": false, 00:14:10.289 "filename": "/dev/ng0n1", 00:14:10.289 "name": "xnvme_bdev" 00:14:10.289 }, 00:14:10.289 "method": "bdev_xnvme_create" 00:14:10.289 }, 00:14:10.289 { 00:14:10.289 "method": "bdev_wait_for_examine" 00:14:10.289 } 00:14:10.289 ] 00:14:10.289 } 00:14:10.289 ] 00:14:10.289 } 00:14:10.290 xnvme_bdev: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:14:10.290 fio-3.35 00:14:10.290 Starting 1 thread 00:14:16.877 00:14:16.877 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=84593: Sun Dec 15 05:04:35 2024 00:14:16.877 write: IOPS=37.2k, BW=145MiB/s (152MB/s)(727MiB/5001msec); 0 zone resets 00:14:16.877 slat (nsec): min=2915, max=93436, avg=4134.41, stdev=2203.78 00:14:16.877 clat (usec): min=171, max=5716, avg=1558.26, stdev=278.74 00:14:16.877 lat (usec): min=179, max=5720, avg=1562.40, stdev=279.04 00:14:16.877 clat percentiles (usec): 00:14:16.877 | 1.00th=[ 816], 5.00th=[ 1172], 10.00th=[ 1270], 20.00th=[ 1352], 00:14:16.877 | 30.00th=[ 1418], 40.00th=[ 1483], 50.00th=[ 1549], 60.00th=[ 1598], 00:14:16.877 | 70.00th=[ 1663], 80.00th=[ 1745], 90.00th=[ 1876], 95.00th=[ 2008], 00:14:16.877 | 99.00th=[ 2311], 99.50th=[ 2540], 99.90th=[ 3326], 99.95th=[ 3621], 00:14:16.877 | 99.99th=[ 4621] 00:14:16.877 bw ( KiB/s): min=143184, max=151440, per=100.00%, avg=148988.44, stdev=2580.41, samples=9 00:14:16.877 iops : min=35796, max=37860, avg=37247.11, stdev=645.10, samples=9 00:14:16.877 lat (usec) : 250=0.01%, 500=0.05%, 750=0.58%, 1000=1.26% 00:14:16.877 lat (msec) : 2=93.08%, 4=5.00%, 10=0.02% 00:14:16.877 cpu : usr=34.08%, sys=64.48%, ctx=10, majf=0, minf=772 00:14:16.877 IO depths : 1=1.4%, 2=2.9%, 4=5.8%, 8=11.7%, 16=23.8%, 32=52.7%, >=64=1.7% 00:14:16.877 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:14:16.877 complete : 0=0.0%, 4=98.4%, 8=0.1%, 16=0.1%, 32=0.1%, 64=1.5%, >=64=0.0% 00:14:16.877 issued rwts: total=0,186031,0,0 short=0,0,0,0 dropped=0,0,0,0 00:14:16.877 latency : target=0, window=0, percentile=100.00%, depth=64 00:14:16.877 00:14:16.877 Run status group 0 (all jobs): 00:14:16.877 WRITE: bw=145MiB/s (152MB/s), 145MiB/s-145MiB/s (152MB/s-152MB/s), io=727MiB (762MB), run=5001-5001msec 00:14:16.877 ----------------------------------------------------- 00:14:16.877 Suppressions used: 00:14:16.877 count bytes template 00:14:16.877 1 11 /usr/src/fio/parse.c 00:14:16.877 1 8 libtcmalloc_minimal.so 00:14:16.877 1 904 libcrypto.so 00:14:16.877 ----------------------------------------------------- 00:14:16.877 00:14:16.877 ************************************ 00:14:16.877 END TEST xnvme_fio_plugin 00:14:16.877 ************************************ 00:14:16.877 00:14:16.877 real 0m12.010s 00:14:16.877 user 0m4.688s 00:14:16.877 sys 0m6.878s 00:14:16.877 05:04:36 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1130 -- # xtrace_disable 00:14:16.877 05:04:36 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:14:16.877 05:04:36 nvme_xnvme -- xnvme/xnvme.sh@82 -- # for cc in "${xnvme_conserve_cpu[@]}" 00:14:16.877 05:04:36 nvme_xnvme -- xnvme/xnvme.sh@83 -- # method_bdev_xnvme_create_0["conserve_cpu"]=true 00:14:16.877 05:04:36 nvme_xnvme -- xnvme/xnvme.sh@84 -- # 
conserve_cpu=true 00:14:16.877 05:04:36 nvme_xnvme -- xnvme/xnvme.sh@86 -- # run_test xnvme_rpc xnvme_rpc 00:14:16.877 05:04:36 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:14:16.877 05:04:36 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:14:16.877 05:04:36 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:16.877 ************************************ 00:14:16.877 START TEST xnvme_rpc 00:14:16.877 ************************************ 00:14:16.877 05:04:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1129 -- # xnvme_rpc 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # cc=() 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@48 -- # local -A cc 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["false"]= 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@50 -- # cc["true"]=-c 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@53 -- # spdk_tgt=84667 00:14:16.878 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@54 -- # waitforlisten 84667 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@835 -- # '[' -z 84667 ']' 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@840 -- # local max_retries=100 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@844 -- # xtrace_disable 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@52 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:14:16.878 05:04:36 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:14:16.878 [2024-12-15 05:04:36.318875] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:14:16.878 [2024-12-15 05:04:36.319013] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid84667 ] 00:14:16.878 [2024-12-15 05:04:36.479378] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:16.878 [2024-12-15 05:04:36.508459] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@868 -- # return 0 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@56 -- # rpc_cmd bdev_xnvme_create /dev/ng0n1 xnvme_bdev io_uring_cmd -c 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:14:17.139 xnvme_bdev 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # rpc_xnvme name 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.name' 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@62 -- # [[ xnvme_bdev == \x\n\v\m\e\_\b\d\e\v ]] 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # rpc_xnvme filename 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.filename' 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@63 -- # [[ /dev/ng0n1 == \/\d\e\v\/\n\g\0\n\1 ]] 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # rpc_xnvme io_mechanism 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.io_mechanism' 00:14:17.139 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@64 -- # [[ io_uring_cmd == \i\o\_\u\r\i\n\g\_\c\m\d ]] 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # rpc_xnvme conserve_cpu 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@65 -- # rpc_cmd framework_get_config bdev 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- 
common/autotest_common.sh@563 -- # xtrace_disable 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/common.sh@66 -- # jq -r '.[] | select(.method == "bdev_xnvme_create").params.conserve_cpu' 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@65 -- # [[ true == \t\r\u\e ]] 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@67 -- # rpc_cmd bdev_xnvme_delete xnvme_bdev 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- xnvme/xnvme.sh@70 -- # killprocess 84667 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@954 -- # '[' -z 84667 ']' 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@958 -- # kill -0 84667 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # uname 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 84667 00:14:17.400 killing process with pid 84667 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@972 -- # echo 'killing process with pid 84667' 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@973 -- # kill 84667 00:14:17.400 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@978 -- # wait 84667 00:14:17.662 ************************************ 00:14:17.662 END TEST xnvme_rpc 00:14:17.662 ************************************ 00:14:17.662 00:14:17.662 real 0m1.415s 00:14:17.662 user 0m1.518s 00:14:17.662 sys 0m0.384s 00:14:17.662 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@1130 -- # xtrace_disable 00:14:17.662 05:04:37 nvme_xnvme.xnvme_rpc -- common/autotest_common.sh@10 -- # set +x 00:14:17.662 05:04:37 nvme_xnvme -- xnvme/xnvme.sh@87 -- # run_test xnvme_bdevperf xnvme_bdevperf 00:14:17.662 05:04:37 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:14:17.662 05:04:37 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:14:17.662 05:04:37 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:17.662 ************************************ 00:14:17.662 START TEST xnvme_bdevperf 00:14:17.662 ************************************ 00:14:17.662 05:04:37 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1129 -- # xnvme_bdevperf 00:14:17.662 05:04:37 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@12 -- # local io_pattern 00:14:17.662 05:04:37 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@13 -- # local -n io_pattern_ref=io_uring_cmd 00:14:17.662 05:04:37 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:14:17.662 05:04:37 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randread -t 5 -T xnvme_bdev -o 4096 00:14:17.662 05:04:37 nvme_xnvme.xnvme_bdevperf -- 
xnvme/xnvme.sh@17 -- # gen_conf 00:14:17.662 05:04:37 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:14:17.662 05:04:37 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:14:17.662 { 00:14:17.662 "subsystems": [ 00:14:17.662 { 00:14:17.662 "subsystem": "bdev", 00:14:17.662 "config": [ 00:14:17.662 { 00:14:17.662 "params": { 00:14:17.662 "io_mechanism": "io_uring_cmd", 00:14:17.662 "conserve_cpu": true, 00:14:17.662 "filename": "/dev/ng0n1", 00:14:17.662 "name": "xnvme_bdev" 00:14:17.662 }, 00:14:17.662 "method": "bdev_xnvme_create" 00:14:17.662 }, 00:14:17.662 { 00:14:17.662 "method": "bdev_wait_for_examine" 00:14:17.662 } 00:14:17.662 ] 00:14:17.662 } 00:14:17.662 ] 00:14:17.662 } 00:14:17.662 [2024-12-15 05:04:37.783330] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:14:17.662 [2024-12-15 05:04:37.783676] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid84719 ] 00:14:17.923 [2024-12-15 05:04:37.946749] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:17.923 [2024-12-15 05:04:37.975467] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:14:18.183 Running I/O for 5 seconds... 00:14:20.070 36800.00 IOPS, 143.75 MiB/s [2024-12-15T05:04:41.152Z] 36320.00 IOPS, 141.88 MiB/s [2024-12-15T05:04:42.094Z] 36586.67 IOPS, 142.92 MiB/s [2024-12-15T05:04:43.481Z] 36512.00 IOPS, 142.62 MiB/s 00:14:23.341 Latency(us) 00:14:23.341 [2024-12-15T05:04:43.481Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:23.341 Job: xnvme_bdev (Core Mask 0x1, workload: randread, depth: 64, IO size: 4096) 00:14:23.341 xnvme_bdev : 5.00 36296.93 141.78 0.00 0.00 1758.70 863.31 4461.49 00:14:23.341 [2024-12-15T05:04:43.481Z] =================================================================================================================== 00:14:23.341 [2024-12-15T05:04:43.481Z] Total : 36296.93 141.78 0.00 0.00 1758.70 863.31 4461.49 00:14:23.341 05:04:43 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:14:23.341 05:04:43 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w randwrite -t 5 -T xnvme_bdev -o 4096 00:14:23.341 05:04:43 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 00:14:23.341 05:04:43 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:14:23.341 05:04:43 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:14:23.341 { 00:14:23.341 "subsystems": [ 00:14:23.341 { 00:14:23.341 "subsystem": "bdev", 00:14:23.341 "config": [ 00:14:23.341 { 00:14:23.341 "params": { 00:14:23.341 "io_mechanism": "io_uring_cmd", 00:14:23.341 "conserve_cpu": true, 00:14:23.341 "filename": "/dev/ng0n1", 00:14:23.341 "name": "xnvme_bdev" 00:14:23.341 }, 00:14:23.341 "method": "bdev_xnvme_create" 00:14:23.341 }, 00:14:23.341 { 00:14:23.341 "method": "bdev_wait_for_examine" 00:14:23.341 } 00:14:23.341 ] 00:14:23.341 } 00:14:23.341 ] 00:14:23.341 } 00:14:23.341 [2024-12-15 05:04:43.333720] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
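[editor's note] The xnvme_rpc test that finished above creates an xnvme bdev over the /dev/ng0n1 char device, reads each creation parameter back through framework_get_config, and tears the bdev down. A minimal standalone sketch of that round-trip, assuming a running spdk_tgt and the stock scripts/rpc.py client from this checkout (rpc_cmd in the trace is a wrapper around it):

rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
# create: filename, bdev name, io_mechanism; -c enables conserve_cpu
"$rpc" bdev_xnvme_create /dev/ng0n1 xnvme_bdev io_uring_cmd -c
# read back the four parameters the test asserts on
"$rpc" framework_get_config bdev |
  jq -r '.[] | select(.method == "bdev_xnvme_create").params
         | .name, .filename, .io_mechanism, .conserve_cpu'
# tear down
"$rpc" bdev_xnvme_delete xnvme_bdev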
00:14:23.341 [2024-12-15 05:04:43.333848] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid84788 ] 00:14:23.602 [2024-12-15 05:04:43.496878] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:23.602 [2024-12-15 05:04:43.525547] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:14:23.602 Running I/O for 5 seconds... 00:14:25.495 37613.00 IOPS, 146.93 MiB/s [2024-12-15T05:04:47.019Z] 37886.00 IOPS, 147.99 MiB/s [2024-12-15T05:04:47.963Z] 36880.00 IOPS, 144.06 MiB/s [2024-12-15T05:04:48.906Z] 29741.50 IOPS, 116.18 MiB/s [2024-12-15T05:04:48.906Z] 25195.00 IOPS, 98.42 MiB/s 00:14:28.766 Latency(us) 00:14:28.766 [2024-12-15T05:04:48.906Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:28.766 Job: xnvme_bdev (Core Mask 0x1, workload: randwrite, depth: 64, IO size: 4096) 00:14:28.766 xnvme_bdev : 5.02 25133.98 98.18 0.00 0.00 2538.72 74.83 37506.76 00:14:28.766 [2024-12-15T05:04:48.906Z] =================================================================================================================== 00:14:28.766 [2024-12-15T05:04:48.906Z] Total : 25133.98 98.18 0.00 0.00 2538.72 74.83 37506.76 00:14:28.766 05:04:48 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:14:28.766 05:04:48 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w unmap -t 5 -T xnvme_bdev -o 4096 00:14:28.766 05:04:48 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 00:14:28.766 05:04:48 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:14:28.766 05:04:48 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:14:28.766 { 00:14:28.766 "subsystems": [ 00:14:28.766 { 00:14:28.766 "subsystem": "bdev", 00:14:28.766 "config": [ 00:14:28.766 { 00:14:28.766 "params": { 00:14:28.766 "io_mechanism": "io_uring_cmd", 00:14:28.766 "conserve_cpu": true, 00:14:28.766 "filename": "/dev/ng0n1", 00:14:28.766 "name": "xnvme_bdev" 00:14:28.766 }, 00:14:28.766 "method": "bdev_xnvme_create" 00:14:28.766 }, 00:14:28.766 { 00:14:28.766 "method": "bdev_wait_for_examine" 00:14:28.766 } 00:14:28.766 ] 00:14:28.766 } 00:14:28.766 ] 00:14:28.766 } 00:14:28.766 [2024-12-15 05:04:48.898773] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:14:28.766 [2024-12-15 05:04:48.898920] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid84851 ] 00:14:29.069 [2024-12-15 05:04:49.064666] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:29.069 [2024-12-15 05:04:49.095720] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:14:29.354 Running I/O for 5 seconds... 
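[editor's note] Each bdevperf run in this section feeds its bdev configuration over /dev/fd/62 via gen_conf; the same run works with the JSON written to a plain file. A sketch using the flags and config copied from the randread invocation above (queue depth 64, 5 seconds, 4 KiB I/O against the xnvme_bdev target); /tmp/xnvme_bdev.json is an arbitrary scratch path:

cat > /tmp/xnvme_bdev.json <<'EOF'
{
  "subsystems": [
    {
      "subsystem": "bdev",
      "config": [
        {
          "method": "bdev_xnvme_create",
          "params": {
            "io_mechanism": "io_uring_cmd",
            "conserve_cpu": true,
            "filename": "/dev/ng0n1",
            "name": "xnvme_bdev"
          }
        },
        { "method": "bdev_wait_for_examine" }
      ]
    }
  ]
}
EOF
# -q queue depth, -w workload, -t seconds, -T target bdev, -o I/O size (bytes)
/home/vagrant/spdk_repo/spdk/build/examples/bdevperf \
  --json /tmp/xnvme_bdev.json -q 64 -w randread -t 5 -T xnvme_bdev -o 4096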
00:14:31.239 72640.00 IOPS, 283.75 MiB/s [2024-12-15T05:04:52.321Z] 72576.00 IOPS, 283.50 MiB/s [2024-12-15T05:04:53.263Z] 75093.33 IOPS, 293.33 MiB/s [2024-12-15T05:04:54.647Z] 76224.00 IOPS, 297.75 MiB/s [2024-12-15T05:04:54.647Z] 78028.80 IOPS, 304.80 MiB/s 00:14:34.507 Latency(us) 00:14:34.507 [2024-12-15T05:04:54.647Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:34.507 Job: xnvme_bdev (Core Mask 0x1, workload: unmap, depth: 64, IO size: 4096) 00:14:34.507 xnvme_bdev : 5.00 78001.64 304.69 0.00 0.00 816.93 381.24 2848.30 00:14:34.507 [2024-12-15T05:04:54.647Z] =================================================================================================================== 00:14:34.507 [2024-12-15T05:04:54.647Z] Total : 78001.64 304.69 0.00 0.00 816.93 381.24 2848.30 00:14:34.507 05:04:54 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@15 -- # for io_pattern in "${io_pattern_ref[@]}" 00:14:34.507 05:04:54 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /dev/fd/62 -q 64 -w write_zeroes -t 5 -T xnvme_bdev -o 4096 00:14:34.507 05:04:54 nvme_xnvme.xnvme_bdevperf -- xnvme/xnvme.sh@17 -- # gen_conf 00:14:34.507 05:04:54 nvme_xnvme.xnvme_bdevperf -- dd/common.sh@31 -- # xtrace_disable 00:14:34.507 05:04:54 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:14:34.507 { 00:14:34.507 "subsystems": [ 00:14:34.507 { 00:14:34.507 "subsystem": "bdev", 00:14:34.507 "config": [ 00:14:34.507 { 00:14:34.507 "params": { 00:14:34.507 "io_mechanism": "io_uring_cmd", 00:14:34.507 "conserve_cpu": true, 00:14:34.507 "filename": "/dev/ng0n1", 00:14:34.507 "name": "xnvme_bdev" 00:14:34.507 }, 00:14:34.507 "method": "bdev_xnvme_create" 00:14:34.507 }, 00:14:34.507 { 00:14:34.507 "method": "bdev_wait_for_examine" 00:14:34.507 } 00:14:34.507 ] 00:14:34.507 } 00:14:34.507 ] 00:14:34.507 } 00:14:34.507 [2024-12-15 05:04:54.395959] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:14:34.507 [2024-12-15 05:04:54.396267] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid84920 ] 00:14:34.507 [2024-12-15 05:04:54.550721] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:34.507 [2024-12-15 05:04:54.571915] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:14:34.767 Running I/O for 5 seconds... 
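[editor's note] The MiB/s column in these latency tables is IOPS times the 4096-byte I/O size: for the unmap run just reported, 78001.64 IOPS / 256 = 304.69 MiB/s, matching the table. A one-line check:

# MiB/s = IOPS * io_size_bytes / 2^20; with 4 KiB I/O that is IOPS / 256
awk 'BEGIN { printf "%.2f MiB/s\n", 78001.64 * 4096 / 1048576 }'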
00:14:36.646 59291.00 IOPS, 231.61 MiB/s [2024-12-15T05:04:57.726Z] 53476.00 IOPS, 208.89 MiB/s [2024-12-15T05:04:58.667Z] 51397.00 IOPS, 200.77 MiB/s [2024-12-15T05:05:00.054Z] 49827.25 IOPS, 194.64 MiB/s [2024-12-15T05:05:00.054Z] 48165.20 IOPS, 188.15 MiB/s 00:14:39.914 Latency(us) 00:14:39.914 [2024-12-15T05:05:00.054Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:14:39.914 Job: xnvme_bdev (Core Mask 0x1, workload: write_zeroes, depth: 64, IO size: 4096) 00:14:39.914 xnvme_bdev : 5.01 48064.71 187.75 0.00 0.00 1325.91 187.47 17241.01 00:14:39.914 [2024-12-15T05:05:00.054Z] =================================================================================================================== 00:14:39.914 [2024-12-15T05:05:00.054Z] Total : 48064.71 187.75 0.00 0.00 1325.91 187.47 17241.01 00:14:39.914 00:14:39.914 real 0m22.134s 00:14:39.914 user 0m13.058s 00:14:39.914 sys 0m6.818s 00:14:39.914 05:04:59 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:14:39.914 05:04:59 nvme_xnvme.xnvme_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:14:39.914 ************************************ 00:14:39.914 END TEST xnvme_bdevperf 00:14:39.914 ************************************ 00:14:39.914 05:04:59 nvme_xnvme -- xnvme/xnvme.sh@88 -- # run_test xnvme_fio_plugin xnvme_fio_plugin 00:14:39.914 05:04:59 nvme_xnvme -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:14:39.914 05:04:59 nvme_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:14:39.914 05:04:59 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:39.914 ************************************ 00:14:39.914 START TEST xnvme_fio_plugin 00:14:39.914 ************************************ 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1129 -- # xnvme_fio_plugin 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@27 -- # local io_pattern 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@28 -- # local -n io_pattern_ref=io_uring_cmd_fio 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- 
common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:14:39.914 05:04:59 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randread --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:39.914 { 00:14:39.914 "subsystems": [ 00:14:39.914 { 00:14:39.914 "subsystem": "bdev", 00:14:39.914 "config": [ 00:14:39.914 { 00:14:39.914 "params": { 00:14:39.914 "io_mechanism": "io_uring_cmd", 00:14:39.914 "conserve_cpu": true, 00:14:39.914 "filename": "/dev/ng0n1", 00:14:39.914 "name": "xnvme_bdev" 00:14:39.914 }, 00:14:39.914 "method": "bdev_xnvme_create" 00:14:39.914 }, 00:14:39.914 { 00:14:39.914 "method": "bdev_wait_for_examine" 00:14:39.914 } 00:14:39.914 ] 00:14:39.914 } 00:14:39.914 ] 00:14:39.914 } 00:14:40.175 xnvme_bdev: (g=0): rw=randread, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:14:40.175 fio-3.35 00:14:40.175 Starting 1 thread 00:14:45.463 00:14:45.463 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=85027: Sun Dec 15 05:05:05 2024 00:14:45.463 read: IOPS=39.5k, BW=154MiB/s (162MB/s)(772MiB/5002msec) 00:14:45.463 slat (usec): min=2, max=118, avg= 3.69, stdev= 1.96 00:14:45.463 clat (usec): min=834, max=3677, avg=1472.24, stdev=268.38 00:14:45.463 lat (usec): min=837, max=3709, avg=1475.93, stdev=269.06 00:14:45.463 clat percentiles (usec): 00:14:45.463 | 1.00th=[ 1004], 5.00th=[ 1090], 10.00th=[ 1139], 20.00th=[ 1221], 00:14:45.463 | 30.00th=[ 1319], 40.00th=[ 1401], 50.00th=[ 1467], 60.00th=[ 1532], 00:14:45.463 | 70.00th=[ 1598], 80.00th=[ 1680], 90.00th=[ 1811], 95.00th=[ 1942], 00:14:45.463 | 99.00th=[ 2245], 99.50th=[ 2376], 99.90th=[ 2802], 99.95th=[ 2966], 00:14:45.463 | 99.99th=[ 3458] 00:14:45.463 bw ( KiB/s): min=143360, max=183296, per=97.99%, avg=154766.22, stdev=15826.06, samples=9 00:14:45.463 iops : min=35840, max=45824, avg=38691.56, stdev=3956.51, samples=9 00:14:45.463 lat (usec) : 1000=0.93% 00:14:45.463 lat (msec) : 2=95.60%, 4=3.47% 00:14:45.463 cpu : usr=56.97%, sys=39.79%, ctx=28, majf=0, minf=771 00:14:45.463 IO depths : 1=1.6%, 2=3.1%, 4=6.2%, 8=12.5%, 16=25.0%, 32=50.0%, >=64=1.6% 00:14:45.463 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:14:45.463 complete : 0=0.0%, 4=98.5%, 8=0.0%, 16=0.0%, 32=0.1%, 
64=1.5%, >=64=0.0% 00:14:45.463 issued rwts: total=197504,0,0,0 short=0,0,0,0 dropped=0,0,0,0 00:14:45.463 latency : target=0, window=0, percentile=100.00%, depth=64 00:14:45.463 00:14:45.463 Run status group 0 (all jobs): 00:14:45.463 READ: bw=154MiB/s (162MB/s), 154MiB/s-154MiB/s (162MB/s-162MB/s), io=772MiB (809MB), run=5002-5002msec 00:14:46.034 ----------------------------------------------------- 00:14:46.034 Suppressions used: 00:14:46.034 count bytes template 00:14:46.034 1 11 /usr/src/fio/parse.c 00:14:46.034 1 8 libtcmalloc_minimal.so 00:14:46.034 1 904 libcrypto.so 00:14:46.034 ----------------------------------------------------- 00:14:46.034 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@30 -- # for io_pattern in "${io_pattern_ref[@]}" 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # fio_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 --numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1343 -- # local sanitizers 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1345 -- # shift 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1347 -- # local asan_lib= 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- xnvme/xnvme.sh@32 -- # gen_conf 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- dd/common.sh@31 -- # xtrace_disable 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # grep libasan 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1351 -- # break 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:14:46.034 05:05:05 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --spdk_json_conf=/dev/fd/62 --filename=xnvme_bdev --direct=1 --bs=4k --iodepth=64 
--numjobs=1 --rw=randwrite --time_based --runtime=5 --thread=1 --name xnvme_bdev 00:14:46.034 { 00:14:46.034 "subsystems": [ 00:14:46.034 { 00:14:46.034 "subsystem": "bdev", 00:14:46.034 "config": [ 00:14:46.034 { 00:14:46.034 "params": { 00:14:46.034 "io_mechanism": "io_uring_cmd", 00:14:46.034 "conserve_cpu": true, 00:14:46.034 "filename": "/dev/ng0n1", 00:14:46.034 "name": "xnvme_bdev" 00:14:46.034 }, 00:14:46.034 "method": "bdev_xnvme_create" 00:14:46.034 }, 00:14:46.034 { 00:14:46.034 "method": "bdev_wait_for_examine" 00:14:46.034 } 00:14:46.034 ] 00:14:46.034 } 00:14:46.034 ] 00:14:46.034 } 00:14:46.034 xnvme_bdev: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=64 00:14:46.034 fio-3.35 00:14:46.034 Starting 1 thread 00:14:52.619 00:14:52.619 xnvme_bdev: (groupid=0, jobs=1): err= 0: pid=85101: Sun Dec 15 05:05:11 2024 00:14:52.619 write: IOPS=38.1k, BW=149MiB/s (156MB/s)(744MiB/5002msec); 0 zone resets 00:14:52.619 slat (usec): min=2, max=386, avg= 4.22, stdev= 2.51 00:14:52.619 clat (usec): min=287, max=6028, avg=1511.83, stdev=260.47 00:14:52.619 lat (usec): min=292, max=6031, avg=1516.04, stdev=261.05 00:14:52.619 clat percentiles (usec): 00:14:52.619 | 1.00th=[ 1057], 5.00th=[ 1172], 10.00th=[ 1221], 20.00th=[ 1303], 00:14:52.619 | 30.00th=[ 1369], 40.00th=[ 1418], 50.00th=[ 1483], 60.00th=[ 1532], 00:14:52.619 | 70.00th=[ 1598], 80.00th=[ 1696], 90.00th=[ 1827], 95.00th=[ 1958], 00:14:52.619 | 99.00th=[ 2311], 99.50th=[ 2507], 99.90th=[ 3228], 99.95th=[ 3589], 00:14:52.619 | 99.99th=[ 4293] 00:14:52.619 bw ( KiB/s): min=147616, max=163600, per=100.00%, avg=152630.22, stdev=4830.81, samples=9 00:14:52.619 iops : min=36904, max=40900, avg=38157.56, stdev=1207.70, samples=9 00:14:52.619 lat (usec) : 500=0.01%, 750=0.01%, 1000=0.35% 00:14:52.619 lat (msec) : 2=95.67%, 4=3.95%, 10=0.02% 00:14:52.619 cpu : usr=45.89%, sys=48.89%, ctx=8, majf=0, minf=772 00:14:52.619 IO depths : 1=1.4%, 2=3.0%, 4=6.1%, 8=12.4%, 16=25.0%, 32=50.4%, >=64=1.7% 00:14:52.619 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:14:52.619 complete : 0=0.0%, 4=98.4%, 8=0.1%, 16=0.1%, 32=0.1%, 64=1.5%, >=64=0.0% 00:14:52.619 issued rwts: total=0,190391,0,0 short=0,0,0,0 dropped=0,0,0,0 00:14:52.619 latency : target=0, window=0, percentile=100.00%, depth=64 00:14:52.619 00:14:52.619 Run status group 0 (all jobs): 00:14:52.619 WRITE: bw=149MiB/s (156MB/s), 149MiB/s-149MiB/s (156MB/s-156MB/s), io=744MiB (780MB), run=5002-5002msec 00:14:52.619 ----------------------------------------------------- 00:14:52.619 Suppressions used: 00:14:52.619 count bytes template 00:14:52.619 1 11 /usr/src/fio/parse.c 00:14:52.619 1 8 libtcmalloc_minimal.so 00:14:52.619 1 904 libcrypto.so 00:14:52.619 ----------------------------------------------------- 00:14:52.619 00:14:52.619 ************************************ 00:14:52.619 END TEST xnvme_fio_plugin 00:14:52.619 ************************************ 00:14:52.619 00:14:52.619 real 0m12.046s 00:14:52.619 user 0m6.282s 00:14:52.619 sys 0m5.029s 00:14:52.619 05:05:11 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@1130 -- # xtrace_disable 00:14:52.619 05:05:11 nvme_xnvme.xnvme_fio_plugin -- common/autotest_common.sh@10 -- # set +x 00:14:52.619 Process with pid 84667 is not found 00:14:52.619 05:05:12 nvme_xnvme -- xnvme/xnvme.sh@1 -- # killprocess 84667 00:14:52.619 05:05:12 nvme_xnvme -- common/autotest_common.sh@954 -- # '[' -z 84667 ']' 00:14:52.619 05:05:12 nvme_xnvme -- 
common/autotest_common.sh@958 -- # kill -0 84667 00:14:52.619 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 958: kill: (84667) - No such process 00:14:52.619 05:05:12 nvme_xnvme -- common/autotest_common.sh@981 -- # echo 'Process with pid 84667 is not found' 00:14:52.619 ************************************ 00:14:52.619 END TEST nvme_xnvme 00:14:52.619 ************************************ 00:14:52.619 00:14:52.619 real 2m57.944s 00:14:52.619 user 1m20.700s 00:14:52.619 sys 1m22.758s 00:14:52.619 05:05:12 nvme_xnvme -- common/autotest_common.sh@1130 -- # xtrace_disable 00:14:52.619 05:05:12 nvme_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:52.619 05:05:12 -- spdk/autotest.sh@245 -- # run_test blockdev_xnvme /home/vagrant/spdk_repo/spdk/test/bdev/blockdev.sh xnvme 00:14:52.619 05:05:12 -- common/autotest_common.sh@1105 -- # '[' 3 -le 1 ']' 00:14:52.619 05:05:12 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:14:52.619 05:05:12 -- common/autotest_common.sh@10 -- # set +x 00:14:52.619 ************************************ 00:14:52.619 START TEST blockdev_xnvme 00:14:52.619 ************************************ 00:14:52.619 05:05:12 blockdev_xnvme -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/bdev/blockdev.sh xnvme 00:14:52.619 * Looking for test storage... 00:14:52.619 * Found test storage at /home/vagrant/spdk_repo/spdk/test/bdev 00:14:52.619 05:05:12 blockdev_xnvme -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:14:52.619 05:05:12 blockdev_xnvme -- common/autotest_common.sh@1711 -- # lcov --version 00:14:52.619 05:05:12 blockdev_xnvme -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@333 -- # local ver1 ver1_l 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@334 -- # local ver2 ver2_l 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@336 -- # IFS=.-: 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@336 -- # read -ra ver1 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@337 -- # IFS=.-: 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@337 -- # read -ra ver2 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@338 -- # local 'op=<' 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@340 -- # ver1_l=2 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@341 -- # ver2_l=1 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@344 -- # case "$op" in 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@345 -- # : 1 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@364 -- # (( v = 0 )) 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@365 -- # decimal 1 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@353 -- # local d=1 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@355 -- # echo 1 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@365 -- # ver1[v]=1 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@366 -- # decimal 2 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@353 -- # local d=2 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@355 -- # echo 2 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@366 -- # ver2[v]=2 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:14:52.620 05:05:12 blockdev_xnvme -- scripts/common.sh@368 -- # return 0 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:14:52.620 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:14:52.620 --rc genhtml_branch_coverage=1 00:14:52.620 --rc genhtml_function_coverage=1 00:14:52.620 --rc genhtml_legend=1 00:14:52.620 --rc geninfo_all_blocks=1 00:14:52.620 --rc geninfo_unexecuted_blocks=1 00:14:52.620 00:14:52.620 ' 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:14:52.620 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:14:52.620 --rc genhtml_branch_coverage=1 00:14:52.620 --rc genhtml_function_coverage=1 00:14:52.620 --rc genhtml_legend=1 00:14:52.620 --rc geninfo_all_blocks=1 00:14:52.620 --rc geninfo_unexecuted_blocks=1 00:14:52.620 00:14:52.620 ' 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:14:52.620 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:14:52.620 --rc genhtml_branch_coverage=1 00:14:52.620 --rc genhtml_function_coverage=1 00:14:52.620 --rc genhtml_legend=1 00:14:52.620 --rc geninfo_all_blocks=1 00:14:52.620 --rc geninfo_unexecuted_blocks=1 00:14:52.620 00:14:52.620 ' 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:14:52.620 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:14:52.620 --rc genhtml_branch_coverage=1 00:14:52.620 --rc genhtml_function_coverage=1 00:14:52.620 --rc genhtml_legend=1 00:14:52.620 --rc geninfo_all_blocks=1 00:14:52.620 --rc geninfo_unexecuted_blocks=1 00:14:52.620 00:14:52.620 ' 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@10 -- # source /home/vagrant/spdk_repo/spdk/test/bdev/nbd_common.sh 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/nbd_common.sh@6 -- # set -e 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@13 -- # conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@17 -- 
# export RPC_PIPE_TIMEOUT=30 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@20 -- # : 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@707 -- # QOS_DEV_1=Malloc_0 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@708 -- # QOS_DEV_2=Null_1 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@709 -- # QOS_RUN_TIME=5 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@711 -- # uname -s 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@711 -- # '[' Linux = Linux ']' 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@713 -- # PRE_RESERVED_MEM=0 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@719 -- # test_type=xnvme 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@720 -- # crypto_device= 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@721 -- # dek= 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@722 -- # env_ctx= 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@723 -- # wait_for_rpc= 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@724 -- # '[' -n '' ']' 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@727 -- # [[ xnvme == bdev ]] 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@727 -- # [[ xnvme == crypto_* ]] 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@730 -- # start_spdk_tgt 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=85230 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@49 -- # waitforlisten 85230 00:14:52.620 05:05:12 blockdev_xnvme -- bdev/blockdev.sh@46 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt '' '' 00:14:52.620 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@835 -- # '[' -z 85230 ']' 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@840 -- # local max_retries=100 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@844 -- # xtrace_disable 00:14:52.620 05:05:12 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:52.620 [2024-12-15 05:05:12.323261] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
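[editor's note] The waitforlisten step above amounts to launching spdk_tgt in the background and polling its RPC socket until it answers. A minimal sketch, assuming the default /var/tmp/spdk.sock and using spdk_get_version as the readiness probe (any cheap RPC would do):

spdk=/home/vagrant/spdk_repo/spdk
"$spdk/build/bin/spdk_tgt" &
tgt_pid=$!
# poll until the target responds on the UNIX domain socket
until "$spdk/scripts/rpc.py" -s /var/tmp/spdk.sock spdk_get_version \
      >/dev/null 2>&1; do
  sleep 0.1
done
echo "spdk_tgt ($tgt_pid) is listening"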
00:14:52.620 [2024-12-15 05:05:12.323596] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid85230 ] 00:14:52.620 [2024-12-15 05:05:12.483994] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:52.620 [2024-12-15 05:05:12.513079] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:14:53.191 05:05:13 blockdev_xnvme -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:14:53.191 05:05:13 blockdev_xnvme -- common/autotest_common.sh@868 -- # return 0 00:14:53.191 05:05:13 blockdev_xnvme -- bdev/blockdev.sh@731 -- # case "$test_type" in 00:14:53.191 05:05:13 blockdev_xnvme -- bdev/blockdev.sh@766 -- # setup_xnvme_conf 00:14:53.191 05:05:13 blockdev_xnvme -- bdev/blockdev.sh@88 -- # local io_mechanism=io_uring 00:14:53.191 05:05:13 blockdev_xnvme -- bdev/blockdev.sh@89 -- # local nvme nvmes 00:14:53.191 05:05:13 blockdev_xnvme -- bdev/blockdev.sh@91 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset 00:14:53.764 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:14:54.025 0000:00:11.0 (1b36 0010): Already using the nvme driver 00:14:54.025 0000:00:10.0 (1b36 0010): Already using the nvme driver 00:14:54.287 0000:00:12.0 (1b36 0010): Already using the nvme driver 00:14:54.287 0000:00:13.0 (1b36 0010): Already using the nvme driver 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@92 -- # get_zoned_devs 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1657 -- # zoned_devs=() 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1657 -- # local -gA zoned_devs 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1658 -- # zoned_ctrls=() 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1658 -- # local -A zoned_ctrls 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1659 -- # local nvme bdf ns 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1669 -- # bdf=0000:00:12.0 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n1 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1650 -- # local device=nvme0n1 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n2 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1650 -- # local device=nvme0n2 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n2/queue/zoned ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1671 -- # is_block_zoned nvme0n3 00:14:54.287 05:05:14 
blockdev_xnvme -- common/autotest_common.sh@1650 -- # local device=nvme0n3 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme0n3/queue/zoned ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1669 -- # bdf=0000:00:13.0 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1671 -- # is_block_zoned nvme1c1n1 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1650 -- # local device=nvme1c1n1 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme1c1n1/queue/zoned ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1669 -- # bdf=0000:00:10.0 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1671 -- # is_block_zoned nvme2n1 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1650 -- # local device=nvme2n1 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme2n1/queue/zoned ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1668 -- # for nvme in /sys/class/nvme/nvme* 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1669 -- # bdf=0000:00:11.0 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1670 -- # for ns in "$nvme/"nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1671 -- # is_block_zoned nvme3n1 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1650 -- # local device=nvme3n1 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1652 -- # [[ -e /sys/block/nvme3n1/queue/zoned ]] 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1653 -- # [[ none != none ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@94 -- # for nvme in /dev/nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -b /dev/nvme0n1 ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -z '' ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@96 -- # nvmes+=("bdev_xnvme_create $nvme ${nvme##*/} $io_mechanism -c") 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@94 -- # for nvme in /dev/nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -b /dev/nvme0n2 ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -z '' ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@96 -- # nvmes+=("bdev_xnvme_create $nvme ${nvme##*/} $io_mechanism -c") 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@94 -- # for nvme in /dev/nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -b /dev/nvme0n3 ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -z '' ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@96 -- # nvmes+=("bdev_xnvme_create $nvme 
${nvme##*/} $io_mechanism -c") 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@94 -- # for nvme in /dev/nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -b /dev/nvme1n1 ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -z '' ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@96 -- # nvmes+=("bdev_xnvme_create $nvme ${nvme##*/} $io_mechanism -c") 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@94 -- # for nvme in /dev/nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -b /dev/nvme2n1 ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -z '' ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@96 -- # nvmes+=("bdev_xnvme_create $nvme ${nvme##*/} $io_mechanism -c") 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@94 -- # for nvme in /dev/nvme*n* 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -b /dev/nvme3n1 ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@95 -- # [[ -z '' ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@96 -- # nvmes+=("bdev_xnvme_create $nvme ${nvme##*/} $io_mechanism -c") 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@99 -- # (( 6 > 0 )) 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@100 -- # rpc_cmd 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@100 -- # printf '%s\n' 'bdev_xnvme_create /dev/nvme0n1 nvme0n1 io_uring -c' 'bdev_xnvme_create /dev/nvme0n2 nvme0n2 io_uring -c' 'bdev_xnvme_create /dev/nvme0n3 nvme0n3 io_uring -c' 'bdev_xnvme_create /dev/nvme1n1 nvme1n1 io_uring -c' 'bdev_xnvme_create /dev/nvme2n1 nvme2n1 io_uring -c' 'bdev_xnvme_create /dev/nvme3n1 nvme3n1 io_uring -c' 00:14:54.287 nvme0n1 00:14:54.287 nvme0n2 00:14:54.287 nvme0n3 00:14:54.287 nvme1n1 00:14:54.287 nvme2n1 00:14:54.287 nvme3n1 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@774 -- # rpc_cmd bdev_wait_for_examine 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@777 -- # cat 00:14:54.287 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@777 -- # rpc_cmd save_subsystem_config -n accel 00:14:54.287 05:05:14 blockdev_xnvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@777 -- # rpc_cmd save_subsystem_config -n bdev 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@777 -- # rpc_cmd save_subsystem_config -n iobuf 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:54.288 
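[editor's note] The setup_xnvme_conf trace above reduces to: enumerate the /dev/nvme*n* block devices, skip zoned namespaces, and turn each survivor into one bdev_xnvme_create call. A distilled sketch (the zoned-namespace check is elided since the log shows all six devices here are non-zoned, and the harness's batched rpc_cmd pipe is replaced by one rpc.py call per device):

io_mechanism=io_uring
nvmes=()
for nvme in /dev/nvme*n*; do
  [[ -b $nvme ]] || continue            # block devices only
  # bdev name = basename of the device node; -c = conserve_cpu
  nvmes+=("bdev_xnvme_create $nvme ${nvme##*/} $io_mechanism -c")
done
printf '%s\n' "${nvmes[@]}" | while read -r cmd; do
  /home/vagrant/spdk_repo/spdk/scripts/rpc.py $cmd   # word split is intentional
done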
05:05:14 blockdev_xnvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@785 -- # mapfile -t bdevs 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@785 -- # rpc_cmd bdev_get_bdevs 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@563 -- # xtrace_disable 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@785 -- # jq -r '.[] | select(.claimed == false)' 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@786 -- # mapfile -t bdevs_name 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@786 -- # printf '%s\n' '{' ' "name": "nvme0n1",' ' "aliases": [' ' "b7cd1c7e-35d8-49b3-bd6e-990b13a1a8d1"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "b7cd1c7e-35d8-49b3-bd6e-990b13a1a8d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' '{' ' "name": "nvme0n2",' ' "aliases": [' ' "c1707fbd-5af2-4521-8bf2-5fdb1d8d6215"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "c1707fbd-5af2-4521-8bf2-5fdb1d8d6215",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' '{' ' "name": "nvme0n3",' ' "aliases": [' ' "beae6bab-c0ea-4531-bc65-4d27e14bc31c"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "beae6bab-c0ea-4531-bc65-4d27e14bc31c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' '{' ' "name": "nvme1n1",' ' "aliases": [' ' 
"35a7846c-bb98-4117-8772-a9c853d4f587"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 262144,' ' "uuid": "35a7846c-bb98-4117-8772-a9c853d4f587",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' '{' ' "name": "nvme2n1",' ' "aliases": [' 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@786 -- # jq -r .name 00:14:54.288 ' "f099eb83-0b1e-44cf-8d11-379b378b416b"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 1548666,' ' "uuid": "f099eb83-0b1e-44cf-8d11-379b378b416b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' '{' ' "name": "nvme3n1",' ' "aliases": [' ' "e1d549eb-3687-4b4e-a157-3e367f0b8027"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 1310720,' ' "uuid": "e1d549eb-3687-4b4e-a157-3e367f0b8027",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@787 -- # bdev_list=("${bdevs_name[@]}") 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@789 -- # hello_world_bdev=nvme0n1 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@790 -- # trap - SIGINT SIGTERM EXIT 00:14:54.288 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@791 -- # killprocess 85230 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@954 -- # '[' -z 85230 ']' 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@958 -- # kill -0 85230 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@959 -- # uname 00:14:54.288 05:05:14 blockdev_xnvme -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:14:54.550 05:05:14 blockdev_xnvme -- common/autotest_common.sh@960 -- # 
ps --no-headers -o comm= 85230 00:14:54.550 killing process with pid 85230 00:14:54.550 05:05:14 blockdev_xnvme -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:14:54.550 05:05:14 blockdev_xnvme -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:14:54.550 05:05:14 blockdev_xnvme -- common/autotest_common.sh@972 -- # echo 'killing process with pid 85230' 00:14:54.550 05:05:14 blockdev_xnvme -- common/autotest_common.sh@973 -- # kill 85230 00:14:54.550 05:05:14 blockdev_xnvme -- common/autotest_common.sh@978 -- # wait 85230 00:14:54.810 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@795 -- # trap cleanup SIGINT SIGTERM EXIT 00:14:54.810 05:05:14 blockdev_xnvme -- bdev/blockdev.sh@797 -- # run_test bdev_hello_world /home/vagrant/spdk_repo/spdk/build/examples/hello_bdev --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -b nvme0n1 '' 00:14:54.810 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1105 -- # '[' 7 -le 1 ']' 00:14:54.810 05:05:14 blockdev_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:14:54.810 05:05:14 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:54.810 ************************************ 00:14:54.810 START TEST bdev_hello_world 00:14:54.810 ************************************ 00:14:54.810 05:05:14 blockdev_xnvme.bdev_hello_world -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/hello_bdev --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -b nvme0n1 '' 00:14:54.810 [2024-12-15 05:05:14.840281] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:14:54.810 [2024-12-15 05:05:14.840830] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid85503 ] 00:14:55.072 [2024-12-15 05:05:15.003172] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:55.072 [2024-12-15 05:05:15.032078] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:14:55.332 [2024-12-15 05:05:15.258691] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:14:55.332 [2024-12-15 05:05:15.258751] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev nvme0n1 00:14:55.332 [2024-12-15 05:05:15.258775] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:14:55.332 [2024-12-15 05:05:15.261036] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:14:55.332 [2024-12-15 05:05:15.261620] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:14:55.333 [2024-12-15 05:05:15.261649] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:14:55.333 [2024-12-15 05:05:15.262226] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
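[editor's note] The hello_bdev output above shows the app open the nvme0n1 xnvme bdev, write the "Hello World!" string, and read it back. The run is reproducible outside the harness with the invocation recorded in the trace above (bdev.json is the config assembled from the bdev_xnvme_create lines; -b names the bdev to open):

/home/vagrant/spdk_repo/spdk/build/examples/hello_bdev \
  --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -b nvme0n1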
00:14:55.333 00:14:55.333 [2024-12-15 05:05:15.262329] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:14:55.333 ************************************ 00:14:55.333 END TEST bdev_hello_world 00:14:55.333 ************************************ 00:14:55.333 00:14:55.333 real 0m0.675s 00:14:55.333 user 0m0.328s 00:14:55.333 sys 0m0.202s 00:14:55.333 05:05:15 blockdev_xnvme.bdev_hello_world -- common/autotest_common.sh@1130 -- # xtrace_disable 00:14:55.333 05:05:15 blockdev_xnvme.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:14:55.594 05:05:15 blockdev_xnvme -- bdev/blockdev.sh@798 -- # run_test bdev_bounds bdev_bounds '' 00:14:55.594 05:05:15 blockdev_xnvme -- common/autotest_common.sh@1105 -- # '[' 3 -le 1 ']' 00:14:55.594 05:05:15 blockdev_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:14:55.594 05:05:15 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:55.594 ************************************ 00:14:55.594 START TEST bdev_bounds 00:14:55.594 ************************************ 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@1129 -- # bdev_bounds '' 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=85523 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:14:55.594 Process bdevio pid: 85523 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 85523' 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 85523 00:14:55.594 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@835 -- # '[' -z 85523 ']' 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- bdev/blockdev.sh@288 -- # /home/vagrant/spdk_repo/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json '' 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@840 -- # local max_retries=100 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@844 -- # xtrace_disable 00:14:55.594 05:05:15 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:14:55.594 [2024-12-15 05:05:15.589986] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
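[editor's note] bdev_bounds, starting here, runs the bdevio app with -w (start up, then wait for an RPC before executing) against the same bdev.json, and kicks off the suites through its tests.py helper, as the trace below shows. A sketch of the two-step pattern with the paths from this checkout (in the harness, a waitforlisten-style poll sits between the two steps):

bdevio_dir=/home/vagrant/spdk_repo/spdk/test/bdev/bdevio
"$bdevio_dir/bdevio" -w -s 0 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json &
# once the RPC socket is up, trigger every registered suite
"$bdevio_dir/tests.py" perform_tests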
00:14:55.594 [2024-12-15 05:05:15.590338] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid85523 ] 00:14:55.857 [2024-12-15 05:05:15.751963] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:14:55.858 [2024-12-15 05:05:15.783902] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:14:55.858 [2024-12-15 05:05:15.784489] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:14:55.858 [2024-12-15 05:05:15.784520] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:14:56.430 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:14:56.430 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@868 -- # return 0 00:14:56.430 05:05:16 blockdev_xnvme.bdev_bounds -- bdev/blockdev.sh@293 -- # /home/vagrant/spdk_repo/spdk/test/bdev/bdevio/tests.py perform_tests 00:14:56.430 I/O targets: 00:14:56.430 nvme0n1: 1048576 blocks of 4096 bytes (4096 MiB) 00:14:56.430 nvme0n2: 1048576 blocks of 4096 bytes (4096 MiB) 00:14:56.430 nvme0n3: 1048576 blocks of 4096 bytes (4096 MiB) 00:14:56.430 nvme1n1: 262144 blocks of 4096 bytes (1024 MiB) 00:14:56.430 nvme2n1: 1548666 blocks of 4096 bytes (6050 MiB) 00:14:56.430 nvme3n1: 1310720 blocks of 4096 bytes (5120 MiB) 00:14:56.430 00:14:56.430 00:14:56.430 CUnit - A unit testing framework for C - Version 2.1-3 00:14:56.430 http://cunit.sourceforge.net/ 00:14:56.430 00:14:56.430 00:14:56.430 Suite: bdevio tests on: nvme3n1 00:14:56.430 Test: blockdev write read block ...passed 00:14:56.430 Test: blockdev write zeroes read block ...passed 00:14:56.430 Test: blockdev write zeroes read no split ...passed 00:14:56.700 Test: blockdev write zeroes read split ...passed 00:14:56.700 Test: blockdev write zeroes read split partial ...passed 00:14:56.700 Test: blockdev reset ...passed 00:14:56.700 Test: blockdev write read 8 blocks ...passed 00:14:56.700 Test: blockdev write read size > 128k ...passed 00:14:56.700 Test: blockdev write read invalid size ...passed 00:14:56.700 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:14:56.700 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:14:56.700 Test: blockdev write read max offset ...passed 00:14:56.700 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:14:56.700 Test: blockdev writev readv 8 blocks ...passed 00:14:56.700 Test: blockdev writev readv 30 x 1block ...passed 00:14:56.700 Test: blockdev writev readv block ...passed 00:14:56.700 Test: blockdev writev readv size > 128k ...passed 00:14:56.700 Test: blockdev writev readv size > 128k in two iovs ...passed 00:14:56.700 Test: blockdev comparev and writev ...passed 00:14:56.700 Test: blockdev nvme passthru rw ...passed 00:14:56.700 Test: blockdev nvme passthru vendor specific ...passed 00:14:56.700 Test: blockdev nvme admin passthru ...passed 00:14:56.700 Test: blockdev copy ...passed 00:14:56.700 Suite: bdevio tests on: nvme2n1 00:14:56.700 Test: blockdev write read block ...passed 00:14:56.700 Test: blockdev write zeroes read block ...passed 00:14:56.700 Test: blockdev write zeroes read no split ...passed 00:14:56.700 Test: blockdev write zeroes read split ...passed 00:14:56.700 Test: blockdev write zeroes read split partial ...passed 00:14:56.700 Test: blockdev reset ...passed 
00:14:56.700 Test: blockdev write read 8 blocks ...passed 00:14:56.700 Test: blockdev write read size > 128k ...passed 00:14:56.700 Test: blockdev write read invalid size ...passed 00:14:56.700 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:14:56.700 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:14:56.700 Test: blockdev write read max offset ...passed 00:14:56.701 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:14:56.701 Test: blockdev writev readv 8 blocks ...passed 00:14:56.701 Test: blockdev writev readv 30 x 1block ...passed 00:14:56.701 Test: blockdev writev readv block ...passed 00:14:56.701 Test: blockdev writev readv size > 128k ...passed 00:14:56.701 Test: blockdev writev readv size > 128k in two iovs ...passed 00:14:56.701 Test: blockdev comparev and writev ...passed 00:14:56.701 Test: blockdev nvme passthru rw ...passed 00:14:56.701 Test: blockdev nvme passthru vendor specific ...passed 00:14:56.701 Test: blockdev nvme admin passthru ...passed 00:14:56.701 Test: blockdev copy ...passed 00:14:56.701 Suite: bdevio tests on: nvme1n1 00:14:56.701 Test: blockdev write read block ...passed 00:14:56.701 Test: blockdev write zeroes read block ...passed 00:14:56.701 Test: blockdev write zeroes read no split ...passed 00:14:56.701 Test: blockdev write zeroes read split ...passed 00:14:56.701 Test: blockdev write zeroes read split partial ...passed 00:14:56.701 Test: blockdev reset ...passed 00:14:56.701 Test: blockdev write read 8 blocks ...passed 00:14:56.701 Test: blockdev write read size > 128k ...passed 00:14:56.701 Test: blockdev write read invalid size ...passed 00:14:56.701 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:14:56.701 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:14:56.701 Test: blockdev write read max offset ...passed 00:14:56.701 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:14:56.701 Test: blockdev writev readv 8 blocks ...passed 00:14:56.701 Test: blockdev writev readv 30 x 1block ...passed 00:14:56.701 Test: blockdev writev readv block ...passed 00:14:56.701 Test: blockdev writev readv size > 128k ...passed 00:14:56.701 Test: blockdev writev readv size > 128k in two iovs ...passed 00:14:56.701 Test: blockdev comparev and writev ...passed 00:14:56.701 Test: blockdev nvme passthru rw ...passed 00:14:56.701 Test: blockdev nvme passthru vendor specific ...passed 00:14:56.701 Test: blockdev nvme admin passthru ...passed 00:14:56.701 Test: blockdev copy ...passed 00:14:56.701 Suite: bdevio tests on: nvme0n3 00:14:56.701 Test: blockdev write read block ...passed 00:14:56.701 Test: blockdev write zeroes read block ...passed 00:14:56.701 Test: blockdev write zeroes read no split ...passed 00:14:56.701 Test: blockdev write zeroes read split ...passed 00:14:56.701 Test: blockdev write zeroes read split partial ...passed 00:14:56.701 Test: blockdev reset ...passed 00:14:56.701 Test: blockdev write read 8 blocks ...passed 00:14:56.701 Test: blockdev write read size > 128k ...passed 00:14:56.701 Test: blockdev write read invalid size ...passed 00:14:56.701 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:14:56.701 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:14:56.701 Test: blockdev write read max offset ...passed 00:14:56.701 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:14:56.701 Test: blockdev writev readv 8 blocks 
...passed 00:14:56.701 Test: blockdev writev readv 30 x 1block ...passed 00:14:56.701 Test: blockdev writev readv block ...passed 00:14:56.701 Test: blockdev writev readv size > 128k ...passed 00:14:56.701 Test: blockdev writev readv size > 128k in two iovs ...passed 00:14:56.701 Test: blockdev comparev and writev ...passed 00:14:56.701 Test: blockdev nvme passthru rw ...passed 00:14:56.701 Test: blockdev nvme passthru vendor specific ...passed 00:14:56.701 Test: blockdev nvme admin passthru ...passed 00:14:56.701 Test: blockdev copy ...passed 00:14:56.701 Suite: bdevio tests on: nvme0n2 00:14:56.701 Test: blockdev write read block ...passed 00:14:56.701 Test: blockdev write zeroes read block ...passed 00:14:56.701 Test: blockdev write zeroes read no split ...passed 00:14:56.701 Test: blockdev write zeroes read split ...passed 00:14:56.701 Test: blockdev write zeroes read split partial ...passed 00:14:56.701 Test: blockdev reset ...passed 00:14:56.701 Test: blockdev write read 8 blocks ...passed 00:14:56.701 Test: blockdev write read size > 128k ...passed 00:14:56.701 Test: blockdev write read invalid size ...passed 00:14:56.701 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:14:56.701 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:14:56.701 Test: blockdev write read max offset ...passed 00:14:56.701 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:14:56.701 Test: blockdev writev readv 8 blocks ...passed 00:14:56.701 Test: blockdev writev readv 30 x 1block ...passed 00:14:56.701 Test: blockdev writev readv block ...passed 00:14:56.701 Test: blockdev writev readv size > 128k ...passed 00:14:56.701 Test: blockdev writev readv size > 128k in two iovs ...passed 00:14:56.701 Test: blockdev comparev and writev ...passed 00:14:56.701 Test: blockdev nvme passthru rw ...passed 00:14:56.701 Test: blockdev nvme passthru vendor specific ...passed 00:14:56.701 Test: blockdev nvme admin passthru ...passed 00:14:56.701 Test: blockdev copy ...passed 00:14:56.701 Suite: bdevio tests on: nvme0n1 00:14:56.701 Test: blockdev write read block ...passed 00:14:56.701 Test: blockdev write zeroes read block ...passed 00:14:56.701 Test: blockdev write zeroes read no split ...passed 00:14:56.701 Test: blockdev write zeroes read split ...passed 00:14:56.701 Test: blockdev write zeroes read split partial ...passed 00:14:56.701 Test: blockdev reset ...passed 00:14:56.701 Test: blockdev write read 8 blocks ...passed 00:14:56.701 Test: blockdev write read size > 128k ...passed 00:14:56.701 Test: blockdev write read invalid size ...passed 00:14:56.701 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:14:56.701 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:14:56.701 Test: blockdev write read max offset ...passed 00:14:56.701 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:14:56.701 Test: blockdev writev readv 8 blocks ...passed 00:14:56.701 Test: blockdev writev readv 30 x 1block ...passed 00:14:56.701 Test: blockdev writev readv block ...passed 00:14:56.701 Test: blockdev writev readv size > 128k ...passed 00:14:56.701 Test: blockdev writev readv size > 128k in two iovs ...passed 00:14:56.994 Test: blockdev comparev and writev ...passed 00:14:56.994 Test: blockdev nvme passthru rw ...passed 00:14:56.994 Test: blockdev nvme passthru vendor specific ...passed 00:14:56.994 Test: blockdev nvme admin passthru ...passed 00:14:56.994 Test: blockdev copy ...passed 
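[annotation] All six per-bdev CUnit suites above finish with every test passed; the run summary follows. For reference, a minimal bash sketch of the sequence the harness traced: start bdevio in wait-for-RPC mode, poll the socket the way waitforlisten does, then drive the suites with tests.py. The SPDK_DIR variable and the polling loop are illustrative assumptions, not the harness's exact code; the binary paths and flags are copied from the trace.

#!/usr/bin/env bash
set -euo pipefail

SPDK_DIR=${SPDK_DIR:-/home/vagrant/spdk_repo/spdk}   # assumed checkout path (matches the trace)
SOCK=/var/tmp/spdk.sock                              # RPC socket the trace waits on

# Launch bdevio against the test bdev config; per the trace, -w makes it
# wait to be driven over RPC instead of running the suites at startup.
"$SPDK_DIR/test/bdev/bdevio/bdevio" -w -s 0 --json "$SPDK_DIR/test/bdev/bdev.json" &
pid=$!

# Poll until the app answers on the RPC socket (what waitforlisten does).
for _ in $(seq 1 100); do
  "$SPDK_DIR/scripts/rpc.py" -s "$SOCK" rpc_get_methods &>/dev/null && break
  sleep 0.1
done

# Run every registered bdevio suite, then shut the target down.
"$SPDK_DIR/test/bdev/bdevio/tests.py" perform_tests
kill "$pid" && wait "$pid" || true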
00:14:56.994 00:14:56.994 Run Summary: Type Total Ran Passed Failed Inactive 00:14:56.994 suites 6 6 n/a 0 0 00:14:56.994 tests 138 138 138 0 0 00:14:56.994 asserts 780 780 780 0 n/a 00:14:56.994 00:14:56.994 Elapsed time = 0.624 seconds 00:14:56.994 0 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 85523 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@954 -- # '[' -z 85523 ']' 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@958 -- # kill -0 85523 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@959 -- # uname 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 85523 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@972 -- # echo 'killing process with pid 85523' 00:14:56.994 killing process with pid 85523 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@973 -- # kill 85523 00:14:56.994 05:05:16 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@978 -- # wait 85523 00:14:56.994 05:05:17 blockdev_xnvme.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:14:56.994 00:14:56.994 real 0m1.549s 00:14:56.994 user 0m3.792s 00:14:56.994 sys 0m0.334s 00:14:56.994 05:05:17 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@1130 -- # xtrace_disable 00:14:56.994 05:05:17 blockdev_xnvme.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:14:56.994 ************************************ 00:14:56.994 END TEST bdev_bounds 00:14:56.994 ************************************ 00:14:57.262 05:05:17 blockdev_xnvme -- bdev/blockdev.sh@799 -- # run_test bdev_nbd nbd_function_test /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 'nvme0n1 nvme0n2 nvme0n3 nvme1n1 nvme2n1 nvme3n1' '' 00:14:57.262 05:05:17 blockdev_xnvme -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:14:57.262 05:05:17 blockdev_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:14:57.262 05:05:17 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:14:57.262 ************************************ 00:14:57.262 START TEST bdev_nbd 00:14:57.262 ************************************ 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@1129 -- # nbd_function_test /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 'nvme0n1 nvme0n2 nvme0n3 nvme1n1 nvme2n1 nvme3n1' '' 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('nvme0n1' 'nvme0n2' 'nvme0n3' 'nvme1n1' 'nvme2n1' 'nvme3n1') 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=6 
00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=6 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('nvme0n1' 'nvme0n2' 'nvme0n3' 'nvme1n1' 'nvme2n1' 'nvme3n1') 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=85577 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 85577 /var/tmp/spdk-nbd.sock 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@835 -- # '[' -z 85577 ']' 00:14:57.262 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@840 -- # local max_retries=100 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@844 -- # xtrace_disable 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:14:57.262 05:05:17 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@316 -- # /home/vagrant/spdk_repo/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json '' 00:14:57.262 [2024-12-15 05:05:17.216842] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:14:57.262 [2024-12-15 05:05:17.217205] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:57.262 [2024-12-15 05:05:17.373926] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:57.524 [2024-12-15 05:05:17.402915] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@868 -- # return 0 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'nvme0n1 nvme0n2 nvme0n3 nvme1n1 nvme2n1 nvme3n1' 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('nvme0n1' 'nvme0n2' 'nvme0n3' 'nvme1n1' 'nvme2n1' 'nvme3n1') 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'nvme0n1 nvme0n2 nvme0n3 nvme1n1 nvme2n1 nvme3n1' 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('nvme0n1' 'nvme0n2' 'nvme0n3' 'nvme1n1' 'nvme2n1' 'nvme3n1') 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:14:58.095 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme0n1 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:14:58.355 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:14:58.356 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:14:58.356 
1+0 records in 00:14:58.356 1+0 records out 00:14:58.356 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00100391 s, 4.1 MB/s 00:14:58.356 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:58.356 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:14:58.356 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:58.356 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:14:58.356 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:14:58.356 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:14:58.356 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:14:58.356 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme0n2 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:14:58.617 1+0 records in 00:14:58.617 1+0 records out 00:14:58.617 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00110364 s, 3.7 MB/s 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:14:58.617 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme0n3 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:14:58.878 05:05:18 
blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd2 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd2 /proc/partitions 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd2 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:14:58.878 1+0 records in 00:14:58.878 1+0 records out 00:14:58.878 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00112941 s, 3.6 MB/s 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:14:58.878 05:05:18 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme1n1 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd3 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd3 /proc/partitions 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd3 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:14:59.139 1+0 records in 00:14:59.139 1+0 records out 00:14:59.139 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00094343 s, 4.3 MB/s 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # 
stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:14:59.139 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme2n1 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd4 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd4 /proc/partitions 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd4 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:14:59.401 1+0 records in 00:14:59.401 1+0 records out 00:14:59.401 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00140948 s, 2.9 MB/s 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:14:59.401 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme3n1 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd5 00:14:59.662 05:05:19 
blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd5 /proc/partitions 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd5 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:14:59.662 1+0 records in 00:14:59.662 1+0 records out 00:14:59.662 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00126957 s, 3.2 MB/s 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 6 )) 00:14:59.662 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@118 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:14:59.924 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd0", 00:14:59.924 "bdev_name": "nvme0n1" 00:14:59.924 }, 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd1", 00:14:59.924 "bdev_name": "nvme0n2" 00:14:59.924 }, 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd2", 00:14:59.924 "bdev_name": "nvme0n3" 00:14:59.924 }, 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd3", 00:14:59.924 "bdev_name": "nvme1n1" 00:14:59.924 }, 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd4", 00:14:59.924 "bdev_name": "nvme2n1" 00:14:59.924 }, 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd5", 00:14:59.924 "bdev_name": "nvme3n1" 00:14:59.924 } 00:14:59.924 ]' 00:14:59.924 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:14:59.924 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd0", 00:14:59.924 "bdev_name": "nvme0n1" 00:14:59.924 }, 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd1", 00:14:59.924 "bdev_name": "nvme0n2" 00:14:59.924 }, 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd2", 00:14:59.924 "bdev_name": "nvme0n3" 00:14:59.924 }, 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd3", 00:14:59.924 "bdev_name": "nvme1n1" 00:14:59.924 }, 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd4", 00:14:59.924 "bdev_name": "nvme2n1" 00:14:59.924 }, 00:14:59.924 { 00:14:59.924 "nbd_device": "/dev/nbd5", 00:14:59.924 "bdev_name": "nvme3n1" 00:14:59.924 } 00:14:59.924 ]' 00:14:59.924 05:05:19 
blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:14:59.924 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5' 00:14:59.924 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:14:59.924 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5') 00:14:59.924 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:14:59.924 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:14:59.924 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:14:59.924 05:05:19 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:00.185 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:15:00.447 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:15:00.447 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:15:00.447 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:15:00.447 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:00.447 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:00.447 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd2 /proc/partitions 00:15:00.447 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:00.447 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:00.447 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:00.447 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:15:00.709 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:15:00.709 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:15:00.709 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:15:00.709 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:00.709 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:00.709 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:15:00.709 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:00.709 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:00.709 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:00.709 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:15:00.970 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:15:00.970 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:15:00.970 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:15:00.970 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:00.970 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:00.970 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:15:00.970 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:00.970 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:00.970 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:00.970 05:05:20 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:15:01.231 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'nvme0n1 nvme0n2 nvme0n3 nvme1n1 nvme2n1 nvme3n1' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13' 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('nvme0n1' 'nvme0n2' 'nvme0n3' 'nvme1n1' 'nvme2n1' 'nvme3n1') 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'nvme0n1 nvme0n2 nvme0n3 nvme1n1 nvme2n1 nvme3n1' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13' 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('nvme0n1' 'nvme0n2' 'nvme0n3' 'nvme1n1' 'nvme2n1' 'nvme3n1') 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:15:01.493 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme0n1 /dev/nbd0 00:15:01.754 /dev/nbd0 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- 
common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:15:01.754 1+0 records in 00:15:01.754 1+0 records out 00:15:01.754 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00241784 s, 1.7 MB/s 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:15:01.754 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme0n2 /dev/nbd1 00:15:02.016 /dev/nbd1 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd1 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd1 /proc/partitions 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd1 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:15:02.016 1+0 records in 00:15:02.016 1+0 records out 00:15:02.016 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000991807 s, 4.1 MB/s 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:15:02.016 05:05:21 
blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:15:02.016 05:05:21 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme0n3 /dev/nbd10 00:15:02.278 /dev/nbd10 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd10 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd10 /proc/partitions 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd10 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:15:02.278 1+0 records in 00:15:02.278 1+0 records out 00:15:02.278 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000899977 s, 4.6 MB/s 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:15:02.278 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme1n1 /dev/nbd11 00:15:02.278 /dev/nbd11 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd11 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:15:02.539 05:05:22 
blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd11 /proc/partitions 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd11 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:15:02.539 1+0 records in 00:15:02.539 1+0 records out 00:15:02.539 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00099472 s, 4.1 MB/s 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:15:02.539 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme2n1 /dev/nbd12 00:15:02.540 /dev/nbd12 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd12 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:15:02.540 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd12 /proc/partitions 00:15:02.801 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:15:02.801 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:15:02.801 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:15:02.801 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd12 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:15:02.801 1+0 records in 00:15:02.801 1+0 records out 00:15:02.801 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00142502 s, 2.9 MB/s 00:15:02.801 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i++ )) 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@15 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk nvme3n1 /dev/nbd13 00:15:02.802 /dev/nbd13 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@872 -- # local nbd_name=nbd13 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@873 -- # local i 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i = 1 )) 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@876 -- # grep -q -w nbd13 /proc/partitions 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@877 -- # break 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@889 -- # dd if=/dev/nbd13 of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:15:02.802 1+0 records in 00:15:02.802 1+0 records out 00:15:02.802 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00102653 s, 4.0 MB/s 00:15:02.802 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:03.063 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@890 -- # size=4096 00:15:03.063 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/nbdtest 00:15:03.063 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:15:03.063 05:05:22 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@893 -- # return 0 00:15:03.063 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:15:03.063 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 6 )) 00:15:03.063 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:15:03.063 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:15:03.063 05:05:22 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd0", 00:15:03.063 "bdev_name": "nvme0n1" 00:15:03.063 }, 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd1", 00:15:03.063 "bdev_name": "nvme0n2" 00:15:03.063 }, 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd10", 00:15:03.063 "bdev_name": "nvme0n3" 00:15:03.063 }, 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd11", 00:15:03.063 "bdev_name": "nvme1n1" 00:15:03.063 }, 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd12", 00:15:03.063 "bdev_name": "nvme2n1" 00:15:03.063 }, 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd13", 00:15:03.063 "bdev_name": "nvme3n1" 00:15:03.063 } 00:15:03.063 ]' 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd 
-- bdev/nbd_common.sh@64 -- # echo '[ 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd0", 00:15:03.063 "bdev_name": "nvme0n1" 00:15:03.063 }, 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd1", 00:15:03.063 "bdev_name": "nvme0n2" 00:15:03.063 }, 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd10", 00:15:03.063 "bdev_name": "nvme0n3" 00:15:03.063 }, 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd11", 00:15:03.063 "bdev_name": "nvme1n1" 00:15:03.063 }, 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd12", 00:15:03.063 "bdev_name": "nvme2n1" 00:15:03.063 }, 00:15:03.063 { 00:15:03.063 "nbd_device": "/dev/nbd13", 00:15:03.063 "bdev_name": "nvme3n1" 00:15:03.063 } 00:15:03.063 ]' 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:15:03.063 /dev/nbd1 00:15:03.063 /dev/nbd10 00:15:03.063 /dev/nbd11 00:15:03.063 /dev/nbd12 00:15:03.063 /dev/nbd13' 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:15:03.063 /dev/nbd1 00:15:03.063 /dev/nbd10 00:15:03.063 /dev/nbd11 00:15:03.063 /dev/nbd12 00:15:03.063 /dev/nbd13' 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=6 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 6 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=6 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 6 -ne 6 ']' 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13' write 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:15:03.063 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:15:03.324 256+0 records in 00:15:03.324 256+0 records out 00:15:03.324 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0128744 s, 81.4 MB/s 00:15:03.324 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:15:03.324 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:15:03.324 256+0 records in 00:15:03.324 256+0 records out 00:15:03.324 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.237566 s, 4.4 MB/s 00:15:03.324 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:15:03.324 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:15:03.586 256+0 records in 00:15:03.586 256+0 records out 00:15:03.586 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.239836 s, 
4.4 MB/s 00:15:03.586 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:15:03.586 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:15:03.848 256+0 records in 00:15:03.848 256+0 records out 00:15:03.848 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.236998 s, 4.4 MB/s 00:15:03.848 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:15:03.848 05:05:23 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:15:04.110 256+0 records in 00:15:04.110 256+0 records out 00:15:04.110 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.196538 s, 5.3 MB/s 00:15:04.110 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:15:04.110 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:15:04.371 256+0 records in 00:15:04.371 256+0 records out 00:15:04.371 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.299679 s, 3.5 MB/s 00:15:04.371 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:15:04.371 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:15:04.633 256+0 records in 00:15:04.633 256+0 records out 00:15:04.633 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.236517 s, 4.4 MB/s 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13' verify 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd0 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd1 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd10 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd11 00:15:04.633 
05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd12 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest /dev/nbd13 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /home/vagrant/spdk_repo/spdk/test/bdev/nbdrandtest 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13' 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13') 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:04.633 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:15:04.897 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:15:04.897 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:15:04.897 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:15:04.897 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:04.897 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:04.897 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:15:04.897 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:04.897 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:04.897 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:04.897 05:05:24 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:15:05.159 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:15:05.159 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:15:05.159 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:15:05.159 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:05.159 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:05.159 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:15:05.159 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:05.159 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:05.159 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:05.159 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk 
/dev/nbd10 00:15:05.419 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:15:05.419 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:15:05.419 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:15:05.419 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:05.419 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:05.419 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:15:05.419 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:05.419 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:05.419 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:05.419 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:15:05.680 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:15:05.680 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:15:05.680 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:15:05.680 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:05.680 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:05.680 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:15:05.680 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:05.680 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:05.680 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:05.680 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:15:05.941 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:15:05.941 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:15:05.941 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:15:05.941 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:05.941 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:05.941 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:15:05.941 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:05.941 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:05.941 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:05.941 05:05:25 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:15:05.941 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:15:05.941 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:15:05.941 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:15:05.941 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:05.941 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:05.941 
05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:15:05.941 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:05.941 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:05.941 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:15:05.941 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:15:05.941 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock /dev/nbd0 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd=/dev/nbd0 00:15:06.202 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@134 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:15:06.463 malloc_lvol_verify 00:15:06.463 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@135 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:15:06.724 96f97dbc-d03e-4824-af43-251c12645ce5 00:15:06.724 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@136 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:15:06.985 dd3ed53c-3ef1-4b43-8df6-cff2429cbacc 00:15:06.985 05:05:26 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@137 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:15:06.985 /dev/nbd0 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@139 -- # wait_for_nbd_set_capacity /dev/nbd0 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@146 -- # local nbd=nbd0 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@148 -- # [[ -e /sys/block/nbd0/size ]] 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@150 -- # (( 8192 == 0 )) 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs.ext4 /dev/nbd0 00:15:06.985 mke2fs 1.47.0 (5-Feb-2023) 00:15:06.985 Discarding device blocks: 0/4096 
done 00:15:06.985 Creating filesystem with 4096 1k blocks and 1024 inodes 00:15:06.985 00:15:06.985 Allocating group tables: 0/1 done 00:15:06.985 Writing inode tables: 0/1 done 00:15:06.985 Creating journal (1024 blocks): done 00:15:06.985 Writing superblocks and filesystem accounting information: 0/1 done 00:15:06.985 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:15:06.985 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@54 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 85577 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@954 -- # '[' -z 85577 ']' 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@958 -- # kill -0 85577 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@959 -- # uname 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 85577 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:15:07.246 killing process with pid 85577 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@972 -- # echo 'killing process with pid 85577' 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@973 -- # kill 85577 00:15:07.246 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@978 -- # wait 85577 00:15:07.508 05:05:27 blockdev_xnvme.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:15:07.508 00:15:07.508 real 0m10.325s 00:15:07.508 user 0m14.026s 00:15:07.508 sys 0m3.801s 00:15:07.508 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@1130 -- # xtrace_disable 00:15:07.508 ************************************ 00:15:07.508 END TEST bdev_nbd 00:15:07.508 ************************************ 00:15:07.508 05:05:27 blockdev_xnvme.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 
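For reference, the write/verify pass that dominates the bdev_nbd run above reduces to a short shell pattern: seed a 1 MiB random file, dd it onto each exported NBD device with oflag=direct, then cmp each device back against the file. A minimal standalone sketch follows, assuming the devices are already exported over /var/tmp/spdk-nbd.sock; the scratch path and two-device list here are illustrative, not the suite's own fixtures.

    #!/usr/bin/env bash
    set -euo pipefail
    # Assumed fixtures: scratch file location and already-exported NBD devices.
    tmp_file=/tmp/nbdrandtest
    nbd_list=(/dev/nbd0 /dev/nbd1)            # the suite above iterates six devices
    # 256 x 4 KiB = 1 MiB of random data as the reference pattern.
    dd if=/dev/urandom of="$tmp_file" bs=4096 count=256
    for dev in "${nbd_list[@]}"; do
        # O_DIRECT write so the data reaches the backing bdev, not just the page cache.
        dd if="$tmp_file" of="$dev" bs=4096 count=256 oflag=direct
    done
    for dev in "${nbd_list[@]}"; do
        # Byte-wise compare of the first 1 MiB; cmp exits non-zero on any mismatch.
        cmp -b -n 1M "$tmp_file" "$dev"
    done
    rm "$tmp_file"

After the compare loop the suite tears each export down over the RPC socket (rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbdN, then polling /proc/partitions until the device name disappears) and re-checks the export count by piping nbd_get_disks through jq -r '.[] | .nbd_device'.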
00:15:07.508 05:05:27 blockdev_xnvme -- bdev/blockdev.sh@800 -- # [[ y == y ]] 00:15:07.508 05:05:27 blockdev_xnvme -- bdev/blockdev.sh@801 -- # '[' xnvme = nvme ']' 00:15:07.508 05:05:27 blockdev_xnvme -- bdev/blockdev.sh@801 -- # '[' xnvme = gpt ']' 00:15:07.508 05:05:27 blockdev_xnvme -- bdev/blockdev.sh@805 -- # run_test bdev_fio fio_test_suite '' 00:15:07.508 05:05:27 blockdev_xnvme -- common/autotest_common.sh@1105 -- # '[' 3 -le 1 ']' 00:15:07.508 05:05:27 blockdev_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:15:07.508 05:05:27 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:15:07.508 ************************************ 00:15:07.508 START TEST bdev_fio 00:15:07.508 ************************************ 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1129 -- # fio_test_suite '' 00:15:07.508 /home/vagrant/spdk_repo/spdk/test/bdev /home/vagrant/spdk_repo/spdk 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /home/vagrant/spdk_repo/spdk/test/bdev 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio verify AIO '' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1284 -- # local config_file=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1285 -- # local workload=verify 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1286 -- # local bdev_type=AIO 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1287 -- # local env_context= 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1288 -- # local fio_dir=/usr/src/fio 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -e /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio ']' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -z verify ']' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1299 -- # '[' -n '' ']' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1303 -- # touch /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1305 -- # cat 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1317 -- # '[' verify == verify ']' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1318 -- # cat 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1327 -- # '[' AIO == AIO ']' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1328 -- # /usr/src/fio/fio --version 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1328 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1329 -- # echo serialize_overlap=1 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- 
bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_nvme0n1]' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=nvme0n1 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_nvme0n2]' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=nvme0n2 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_nvme0n3]' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=nvme0n3 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_nvme1n1]' 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=nvme1n1 00:15:07.508 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_nvme2n1]' 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=nvme2n1 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_nvme3n1]' 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=nvme3n1 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json' 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1105 -- # '[' 11 -le 1 ']' 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1111 -- # xtrace_disable 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:15:07.509 ************************************ 00:15:07.509 START TEST bdev_fio_rw_verify 00:15:07.509 ************************************ 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1129 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --spdk_mem=0 
--aux-path=/home/vagrant/spdk_repo/spdk/../output 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local sanitizers 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # shift 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1347 -- # local asan_lib= 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1349 -- # grep libasan 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # break 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:15:07.509 05:05:27 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/home/vagrant/spdk_repo/spdk/../output 00:15:07.885 job_nvme0n1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:15:07.885 job_nvme0n2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:15:07.885 job_nvme0n3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:15:07.885 job_nvme1n1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:15:07.885 job_nvme2n1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:15:07.885 job_nvme3n1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:15:07.885 fio-3.35 00:15:07.885 Starting 6 threads 00:15:20.123 00:15:20.123 job_nvme0n1: (groupid=0, jobs=6): err= 0: pid=85979: Sun Dec 15 05:05:38 2024 00:15:20.123 read: IOPS=16.6k, BW=64.8MiB/s (68.0MB/s)(649MiB/10002msec) 00:15:20.123 slat (usec): min=2, max=2094, avg= 6.37, stdev=18.83 00:15:20.123 clat (usec): min=87, max=6632, avg=1116.76, stdev=736.59 00:15:20.123 lat (usec): min=90, max=6637, avg=1123.13, stdev=737.30 
00:15:20.123 clat percentiles (usec): 00:15:20.123 | 50.000th=[ 979], 99.000th=[ 3458], 99.900th=[ 4817], 99.990th=[ 6390], 00:15:20.123 | 99.999th=[ 6587] 00:15:20.123 write: IOPS=16.9k, BW=65.9MiB/s (69.1MB/s)(659MiB/10002msec); 0 zone resets 00:15:20.123 slat (usec): min=13, max=5689, avg=41.82, stdev=144.08 00:15:20.123 clat (usec): min=71, max=8477, avg=1421.40, stdev=827.98 00:15:20.123 lat (usec): min=88, max=8509, avg=1463.22, stdev=842.02 00:15:20.123 clat percentiles (usec): 00:15:20.123 | 50.000th=[ 1287], 99.000th=[ 4015], 99.900th=[ 5538], 99.990th=[ 6980], 00:15:20.123 | 99.999th=[ 8455] 00:15:20.123 bw ( KiB/s): min=49446, max=92511, per=100.00%, avg=67594.00, stdev=2069.86, samples=114 00:15:20.123 iops : min=12361, max=23127, avg=16897.84, stdev=517.43, samples=114 00:15:20.123 lat (usec) : 100=0.01%, 250=4.26%, 500=11.14%, 750=13.71%, 1000=14.11% 00:15:20.123 lat (msec) : 2=40.69%, 4=15.37%, 10=0.71% 00:15:20.123 cpu : usr=38.70%, sys=34.97%, ctx=6135, majf=0, minf=16188 00:15:20.123 IO depths : 1=11.1%, 2=23.5%, 4=51.4%, 8=13.9%, 16=0.0%, 32=0.0%, >=64=0.0% 00:15:20.123 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:20.123 complete : 0=0.0%, 4=89.2%, 8=10.8%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:15:20.123 issued rwts: total=166018,168734,0,0 short=0,0,0,0 dropped=0,0,0,0 00:15:20.123 latency : target=0, window=0, percentile=100.00%, depth=8 00:15:20.123 00:15:20.123 Run status group 0 (all jobs): 00:15:20.123 READ: bw=64.8MiB/s (68.0MB/s), 64.8MiB/s-64.8MiB/s (68.0MB/s-68.0MB/s), io=649MiB (680MB), run=10002-10002msec 00:15:20.123 WRITE: bw=65.9MiB/s (69.1MB/s), 65.9MiB/s-65.9MiB/s (69.1MB/s-69.1MB/s), io=659MiB (691MB), run=10002-10002msec 00:15:20.123 ----------------------------------------------------- 00:15:20.123 Suppressions used: 00:15:20.123 count bytes template 00:15:20.123 6 48 /usr/src/fio/parse.c 00:15:20.123 2610 250560 /usr/src/fio/iolog.c 00:15:20.123 1 8 libtcmalloc_minimal.so 00:15:20.123 1 904 libcrypto.so 00:15:20.123 ----------------------------------------------------- 00:15:20.123 00:15:20.123 00:15:20.123 real 0m11.107s 00:15:20.123 user 0m23.939s 00:15:20.123 sys 0m21.276s 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1130 -- # xtrace_disable 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:15:20.123 ************************************ 00:15:20.123 END TEST bdev_fio_rw_verify 00:15:20.123 ************************************ 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio trim '' '' 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1284 -- # local config_file=/home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1285 -- # local workload=trim 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1286 -- # local bdev_type= 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1287 -- # local env_context= 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1288 -- # local fio_dir=/usr/src/fio 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- 
common/autotest_common.sh@1290 -- # '[' -e /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio ']' 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -z trim ']' 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1299 -- # '[' -n '' ']' 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1303 -- # touch /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1305 -- # cat 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1317 -- # '[' trim == verify ']' 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1332 -- # '[' trim == trim ']' 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1333 -- # echo rw=trimwrite 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "nvme0n1",' ' "aliases": [' ' "b7cd1c7e-35d8-49b3-bd6e-990b13a1a8d1"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "b7cd1c7e-35d8-49b3-bd6e-990b13a1a8d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' '{' ' "name": "nvme0n2",' ' "aliases": [' ' "c1707fbd-5af2-4521-8bf2-5fdb1d8d6215"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "c1707fbd-5af2-4521-8bf2-5fdb1d8d6215",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' '{' ' "name": "nvme0n3",' ' "aliases": [' ' "beae6bab-c0ea-4531-bc65-4d27e14bc31c"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 1048576,' ' "uuid": "beae6bab-c0ea-4531-bc65-4d27e14bc31c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' 
"zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' '{' ' "name": "nvme1n1",' ' "aliases": [' ' "35a7846c-bb98-4117-8772-a9c853d4f587"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 262144,' ' "uuid": "35a7846c-bb98-4117-8772-a9c853d4f587",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' '{' ' "name": "nvme2n1",' ' "aliases": [' ' "f099eb83-0b1e-44cf-8d11-379b378b416b"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 1548666,' ' "uuid": "f099eb83-0b1e-44cf-8d11-379b378b416b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' '{' ' "name": "nvme3n1",' ' "aliases": [' ' "e1d549eb-3687-4b4e-a157-3e367f0b8027"' ' ],' ' "product_name": "xNVMe bdev",' ' "block_size": 4096,' ' "num_blocks": 1310720,' ' "uuid": "e1d549eb-3687-4b4e-a157-3e367f0b8027",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": false,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {}' '}' 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n '' ]] 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@360 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdev.fio 00:15:20.123 /home/vagrant/spdk_repo/spdk 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@361 -- # popd 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@362 -- # trap - SIGINT SIGTERM EXIT 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- bdev/blockdev.sh@363 -- # return 0 00:15:20.123 00:15:20.123 real 0m11.277s 00:15:20.123 user 
0m24.010s 00:15:20.123 sys 0m21.351s 00:15:20.123 ************************************ 00:15:20.123 END TEST bdev_fio 00:15:20.123 ************************************ 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@1130 -- # xtrace_disable 00:15:20.123 05:05:38 blockdev_xnvme.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:15:20.124 05:05:38 blockdev_xnvme -- bdev/blockdev.sh@812 -- # trap cleanup SIGINT SIGTERM EXIT 00:15:20.124 05:05:38 blockdev_xnvme -- bdev/blockdev.sh@814 -- # run_test bdev_verify /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:15:20.124 05:05:38 blockdev_xnvme -- common/autotest_common.sh@1105 -- # '[' 16 -le 1 ']' 00:15:20.124 05:05:38 blockdev_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:15:20.124 05:05:38 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:15:20.124 ************************************ 00:15:20.124 START TEST bdev_verify 00:15:20.124 ************************************ 00:15:20.124 05:05:38 blockdev_xnvme.bdev_verify -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:15:20.124 [2024-12-15 05:05:38.936220] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:15:20.124 [2024-12-15 05:05:38.936359] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86145 ] 00:15:20.124 [2024-12-15 05:05:39.099239] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:15:20.124 [2024-12-15 05:05:39.129650] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:15:20.124 [2024-12-15 05:05:39.129776] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:15:20.124 Running I/O for 5 seconds... 
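The five-second run that follows is driven by SPDK's bdevperf example app against the same bdev.json used by the fio suite. A sketch of the invocation, with flag notes reflecting common bdevperf usage; the -C flag and trailing '' are passed through by the test wrapper exactly as shown in the command line above:

    # -q 128      queue depth per job
    # -o 4096     I/O size in bytes
    # -w verify   write a pattern, read it back, and compare
    # -t 5        run time in seconds
    # -m 0x3      core mask: reactors on cores 0 and 1
    # -C          as passed by the wrapper; in this run each bdev reports one
    #             job on core mask 0x1 and one on 0x2
    /home/vagrant/spdk_repo/spdk/build/examples/bdevperf \
        --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json \
        -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''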
00:15:21.639 23744.00 IOPS, 92.75 MiB/s [2024-12-15T05:05:42.722Z] 23951.00 IOPS, 93.56 MiB/s [2024-12-15T05:05:43.666Z] 23562.67 IOPS, 92.04 MiB/s [2024-12-15T05:05:44.608Z] 23991.50 IOPS, 93.72 MiB/s [2024-12-15T05:05:44.608Z] 23769.60 IOPS, 92.85 MiB/s 00:15:24.468 Latency(us) 00:15:24.468 [2024-12-15T05:05:44.608Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:15:24.468 Job: nvme0n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0x0 length 0x80000 00:15:24.469 nvme0n1 : 5.03 1757.09 6.86 0.00 0.00 72719.58 11594.83 63317.86 00:15:24.469 Job: nvme0n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0x80000 length 0x80000 00:15:24.469 nvme0n1 : 5.02 1936.10 7.56 0.00 0.00 65982.10 7763.50 75013.51 00:15:24.469 Job: nvme0n2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0x0 length 0x80000 00:15:24.469 nvme0n2 : 5.03 1754.14 6.85 0.00 0.00 72709.31 8973.39 71787.13 00:15:24.469 Job: nvme0n2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0x80000 length 0x80000 00:15:24.469 nvme0n2 : 5.08 1939.46 7.58 0.00 0.00 65718.32 11796.48 68964.04 00:15:24.469 Job: nvme0n3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0x0 length 0x80000 00:15:24.469 nvme0n3 : 5.05 1750.48 6.84 0.00 0.00 72743.03 15627.82 64527.75 00:15:24.469 Job: nvme0n3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0x80000 length 0x80000 00:15:24.469 nvme0n3 : 5.06 1922.39 7.51 0.00 0.00 66149.28 11141.12 64124.46 00:15:24.469 Job: nvme1n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0x0 length 0x20000 00:15:24.469 nvme1n1 : 5.05 1749.87 6.84 0.00 0.00 72642.97 10132.87 72593.72 00:15:24.469 Job: nvme1n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0x20000 length 0x20000 00:15:24.469 nvme1n1 : 5.09 1935.66 7.56 0.00 0.00 65554.78 11947.72 68157.44 00:15:24.469 Job: nvme2n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0x0 length 0xbd0bd 00:15:24.469 nvme2n1 : 5.06 2508.62 9.80 0.00 0.00 50534.73 5494.94 53638.70 00:15:24.469 Job: nvme2n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0xbd0bd length 0xbd0bd 00:15:24.469 nvme2n1 : 5.09 2522.84 9.85 0.00 0.00 50141.56 5923.45 55251.89 00:15:24.469 Job: nvme3n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0x0 length 0xa0000 00:15:24.469 nvme3n1 : 5.06 1797.16 7.02 0.00 0.00 70508.91 6175.51 69367.34 00:15:24.469 Job: nvme3n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:15:24.469 Verification LBA range: start 0xa0000 length 0xa0000 00:15:24.469 nvme3n1 : 5.09 1962.22 7.66 0.00 0.00 64443.69 6200.71 74206.92 00:15:24.469 [2024-12-15T05:05:44.609Z] =================================================================================================================== 00:15:24.469 [2024-12-15T05:05:44.609Z] Total : 23536.03 91.94 0.00 0.00 64794.46 5494.94 75013.51 00:15:24.729 00:15:24.729 real 0m5.858s 00:15:24.729 user 0m9.216s 00:15:24.729 sys 0m1.569s 00:15:24.729 05:05:44 blockdev_xnvme.bdev_verify -- 
common/autotest_common.sh@1130 -- # xtrace_disable 00:15:24.729 05:05:44 blockdev_xnvme.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:15:24.729 ************************************ 00:15:24.729 END TEST bdev_verify 00:15:24.729 ************************************ 00:15:24.729 05:05:44 blockdev_xnvme -- bdev/blockdev.sh@815 -- # run_test bdev_verify_big_io /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:15:24.729 05:05:44 blockdev_xnvme -- common/autotest_common.sh@1105 -- # '[' 16 -le 1 ']' 00:15:24.729 05:05:44 blockdev_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:15:24.729 05:05:44 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:15:24.729 ************************************ 00:15:24.729 START TEST bdev_verify_big_io 00:15:24.729 ************************************ 00:15:24.729 05:05:44 blockdev_xnvme.bdev_verify_big_io -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:15:24.990 [2024-12-15 05:05:44.871750] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:15:24.990 [2024-12-15 05:05:44.871895] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86235 ] 00:15:24.990 [2024-12-15 05:05:45.034970] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:15:24.990 [2024-12-15 05:05:45.065564] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:15:24.990 [2024-12-15 05:05:45.065604] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:15:25.250 Running I/O for 5 seconds... 
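The big-I/O variant below reuses the same bdevperf harness; per the command line above, the only flag that changes is the I/O size, from 4096 to 65536 bytes (64 KiB), which stresses throughput rather than per-command overhead. A sketch of the delta, same JSON config assumed:

    # Identical to the previous verify run except -o: 64 KiB I/Os instead of 4 KiB.
    /home/vagrant/spdk_repo/spdk/build/examples/bdevperf \
        --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json \
        -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''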
00:15:31.373 1448.00 IOPS, 90.50 MiB/s [2024-12-15T05:05:51.513Z] 3644.00 IOPS, 227.75 MiB/s 00:15:31.373 Latency(us) 00:15:31.373 [2024-12-15T05:05:51.513Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:15:31.373 Job: nvme0n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0x0 length 0x8000 00:15:31.373 nvme0n1 : 6.03 84.94 5.31 0.00 0.00 1466406.67 6326.74 1258291.20 00:15:31.373 Job: nvme0n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0x8000 length 0x8000 00:15:31.373 nvme0n1 : 5.68 135.16 8.45 0.00 0.00 933523.04 136314.88 890483.00 00:15:31.373 Job: nvme0n2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0x0 length 0x8000 00:15:31.373 nvme0n2 : 5.80 82.72 5.17 0.00 0.00 1442012.13 177451.32 2103604.78 00:15:31.373 Job: nvme0n2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0x8000 length 0x8000 00:15:31.373 nvme0n2 : 5.69 123.74 7.73 0.00 0.00 988548.19 6755.25 1703532.70 00:15:31.373 Job: nvme0n3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0x0 length 0x8000 00:15:31.373 nvme0n3 : 5.90 108.48 6.78 0.00 0.00 1075908.92 90742.15 922746.88 00:15:31.373 Job: nvme0n3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0x8000 length 0x8000 00:15:31.373 nvme0n3 : 5.79 132.70 8.29 0.00 0.00 894581.10 158093.00 748521.94 00:15:31.373 Job: nvme1n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0x0 length 0x2000 00:15:31.373 nvme1n1 : 5.91 75.83 4.74 0.00 0.00 1495669.70 38716.65 3716798.62 00:15:31.373 Job: nvme1n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0x2000 length 0x2000 00:15:31.373 nvme1n1 : 5.79 176.84 11.05 0.00 0.00 658826.14 16434.41 751748.33 00:15:31.373 Job: nvme2n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0x0 length 0xbd0b 00:15:31.373 nvme2n1 : 5.91 83.93 5.25 0.00 0.00 1295831.09 26214.40 3174765.49 00:15:31.373 Job: nvme2n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0xbd0b length 0xbd0b 00:15:31.373 nvme2n1 : 5.78 143.88 8.99 0.00 0.00 784071.10 5595.77 1535760.54 00:15:31.373 Job: nvme3n1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0x0 length 0xa000 00:15:31.373 nvme3n1 : 6.03 179.03 11.19 0.00 0.00 579311.56 1083.86 2026171.47 00:15:31.373 Job: nvme3n1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:15:31.373 Verification LBA range: start 0xa000 length 0xa000 00:15:31.373 nvme3n1 : 5.79 154.63 9.66 0.00 0.00 712118.69 6654.42 838860.80 00:15:31.373 [2024-12-15T05:05:51.513Z] =================================================================================================================== 00:15:31.373 [2024-12-15T05:05:51.513Z] Total : 1481.87 92.62 0.00 0.00 940675.52 1083.86 3716798.62 00:15:31.634 00:15:31.634 real 0m6.848s 00:15:31.634 user 0m12.559s 00:15:31.634 sys 0m0.447s 00:15:31.634 05:05:51 blockdev_xnvme.bdev_verify_big_io -- common/autotest_common.sh@1130 -- # xtrace_disable 00:15:31.634 05:05:51 blockdev_xnvme.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 
00:15:31.634 ************************************ 00:15:31.634 END TEST bdev_verify_big_io 00:15:31.634 ************************************ 00:15:31.634 05:05:51 blockdev_xnvme -- bdev/blockdev.sh@816 -- # run_test bdev_write_zeroes /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:15:31.634 05:05:51 blockdev_xnvme -- common/autotest_common.sh@1105 -- # '[' 13 -le 1 ']' 00:15:31.634 05:05:51 blockdev_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:15:31.634 05:05:51 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:15:31.634 ************************************ 00:15:31.634 START TEST bdev_write_zeroes 00:15:31.634 ************************************ 00:15:31.634 05:05:51 blockdev_xnvme.bdev_write_zeroes -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:15:31.895 [2024-12-15 05:05:51.799823] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:15:31.895 [2024-12-15 05:05:51.800164] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86335 ] 00:15:31.895 [2024-12-15 05:05:51.963233] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:31.895 [2024-12-15 05:05:51.992209] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:15:32.156 Running I/O for 1 seconds... 00:15:33.543 71680.00 IOPS, 280.00 MiB/s 00:15:33.543 Latency(us) 00:15:33.543 [2024-12-15T05:05:53.683Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:15:33.543 Job: nvme0n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:15:33.543 nvme0n1 : 1.02 11870.18 46.37 0.00 0.00 10771.98 8368.44 21878.94 00:15:33.543 Job: nvme0n2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:15:33.543 nvme0n2 : 1.01 11940.81 46.64 0.00 0.00 10697.47 8418.86 20870.70 00:15:33.543 Job: nvme0n3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:15:33.543 nvme0n3 : 1.03 11849.75 46.29 0.00 0.00 10770.50 8469.27 20366.57 00:15:33.543 Job: nvme1n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:15:33.543 nvme1n1 : 1.03 11836.45 46.24 0.00 0.00 10769.43 8469.27 20366.57 00:15:33.543 Job: nvme2n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:15:33.543 nvme2n1 : 1.02 12010.72 46.92 0.00 0.00 10602.00 7612.26 22685.54 00:15:33.543 Job: nvme3n1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:15:33.543 nvme3n1 : 1.03 11822.41 46.18 0.00 0.00 10763.91 8418.86 20769.87 00:15:33.543 [2024-12-15T05:05:53.683Z] =================================================================================================================== 00:15:33.543 [2024-12-15T05:05:53.683Z] Total : 71330.32 278.63 0.00 0.00 10729.05 7612.26 22685.54 00:15:33.544 00:15:33.544 real 0m1.766s 00:15:33.544 user 0m1.058s 00:15:33.544 sys 0m0.518s 00:15:33.544 05:05:53 blockdev_xnvme.bdev_write_zeroes -- common/autotest_common.sh@1130 -- # xtrace_disable 00:15:33.544 05:05:53 blockdev_xnvme.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:15:33.544 ************************************ 00:15:33.544 END TEST 
bdev_write_zeroes 00:15:33.544 ************************************ 00:15:33.544 05:05:53 blockdev_xnvme -- bdev/blockdev.sh@819 -- # run_test bdev_json_nonenclosed /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:15:33.544 05:05:53 blockdev_xnvme -- common/autotest_common.sh@1105 -- # '[' 13 -le 1 ']' 00:15:33.544 05:05:53 blockdev_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:15:33.544 05:05:53 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:15:33.544 ************************************ 00:15:33.544 START TEST bdev_json_nonenclosed 00:15:33.544 ************************************ 00:15:33.544 05:05:53 blockdev_xnvme.bdev_json_nonenclosed -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:15:33.544 [2024-12-15 05:05:53.638028] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:15:33.544 [2024-12-15 05:05:53.638354] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86372 ] 00:15:33.805 [2024-12-15 05:05:53.802249] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:33.805 [2024-12-15 05:05:53.830974] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:15:33.805 [2024-12-15 05:05:53.831079] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:15:33.805 [2024-12-15 05:05:53.831101] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:15:33.805 [2024-12-15 05:05:53.831114] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:15:33.805 00:15:33.805 real 0m0.343s 00:15:33.805 user 0m0.131s 00:15:33.805 sys 0m0.107s 00:15:33.805 05:05:53 blockdev_xnvme.bdev_json_nonenclosed -- common/autotest_common.sh@1130 -- # xtrace_disable 00:15:33.805 05:05:53 blockdev_xnvme.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:15:33.805 ************************************ 00:15:33.805 END TEST bdev_json_nonenclosed 00:15:33.805 ************************************ 00:15:34.067 05:05:53 blockdev_xnvme -- bdev/blockdev.sh@822 -- # run_test bdev_json_nonarray /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:15:34.067 05:05:53 blockdev_xnvme -- common/autotest_common.sh@1105 -- # '[' 13 -le 1 ']' 00:15:34.067 05:05:53 blockdev_xnvme -- common/autotest_common.sh@1111 -- # xtrace_disable 00:15:34.067 05:05:53 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:15:34.067 ************************************ 00:15:34.067 START TEST bdev_json_nonarray 00:15:34.067 ************************************ 00:15:34.067 05:05:53 blockdev_xnvme.bdev_json_nonarray -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf --json /home/vagrant/spdk_repo/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:15:34.067 [2024-12-15 05:05:54.047214] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:15:34.067 [2024-12-15 05:05:54.047619] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86398 ] 00:15:34.328 [2024-12-15 05:05:54.210053] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:34.328 [2024-12-15 05:05:54.238314] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:15:34.328 [2024-12-15 05:05:54.238424] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:15:34.328 [2024-12-15 05:05:54.238463] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:15:34.328 [2024-12-15 05:05:54.238480] app.c:1064:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:15:34.328 00:15:34.328 real 0m0.339s 00:15:34.328 user 0m0.128s 00:15:34.328 sys 0m0.107s 00:15:34.328 05:05:54 blockdev_xnvme.bdev_json_nonarray -- common/autotest_common.sh@1130 -- # xtrace_disable 00:15:34.328 ************************************ 00:15:34.328 END TEST bdev_json_nonarray 00:15:34.328 ************************************ 00:15:34.328 05:05:54 blockdev_xnvme.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@824 -- # [[ xnvme == bdev ]] 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@832 -- # [[ xnvme == gpt ]] 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@836 -- # [[ xnvme == crypto_sw ]] 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@848 -- # trap - SIGINT SIGTERM EXIT 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@849 -- # cleanup 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@23 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/aiofile 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@24 -- # rm -f /home/vagrant/spdk_repo/spdk/test/bdev/bdev.json 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@26 -- # [[ xnvme == rbd ]] 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@30 -- # [[ xnvme == daos ]] 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@34 -- # [[ xnvme = \g\p\t ]] 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@40 -- # [[ xnvme == xnvme ]] 00:15:34.328 05:05:54 blockdev_xnvme -- bdev/blockdev.sh@41 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:15:34.901 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:15:43.049 0000:00:10.0 (1b36 0010): nvme -> uio_pci_generic 00:15:44.960 0000:00:13.0 (1b36 0010): nvme -> uio_pci_generic 00:15:44.960 0000:00:11.0 (1b36 0010): nvme -> uio_pci_generic 00:15:44.960 0000:00:12.0 (1b36 0010): nvme -> uio_pci_generic 00:15:44.960 00:15:44.960 real 0m52.860s 00:15:44.960 user 1m8.919s 00:15:44.960 sys 0m47.877s 00:15:44.960 05:06:04 blockdev_xnvme -- common/autotest_common.sh@1130 -- # xtrace_disable 00:15:44.960 05:06:04 blockdev_xnvme -- common/autotest_common.sh@10 -- # set +x 00:15:44.960 ************************************ 00:15:44.960 END TEST blockdev_xnvme 00:15:44.960 ************************************ 00:15:44.960 05:06:04 -- spdk/autotest.sh@247 -- # run_test ublk /home/vagrant/spdk_repo/spdk/test/ublk/ublk.sh 00:15:44.960 05:06:04 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:15:44.960 05:06:04 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:15:44.960 05:06:04 -- 
common/autotest_common.sh@10 -- # set +x 00:15:44.960 ************************************ 00:15:44.960 START TEST ublk 00:15:44.960 ************************************ 00:15:44.960 05:06:04 ublk -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/ublk/ublk.sh 00:15:44.960 * Looking for test storage... 00:15:44.960 * Found test storage at /home/vagrant/spdk_repo/spdk/test/ublk 00:15:44.960 05:06:05 ublk -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:15:44.960 05:06:05 ublk -- common/autotest_common.sh@1711 -- # lcov --version 00:15:44.960 05:06:05 ublk -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:15:45.221 05:06:05 ublk -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:15:45.221 05:06:05 ublk -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:15:45.221 05:06:05 ublk -- scripts/common.sh@333 -- # local ver1 ver1_l 00:15:45.221 05:06:05 ublk -- scripts/common.sh@334 -- # local ver2 ver2_l 00:15:45.221 05:06:05 ublk -- scripts/common.sh@336 -- # IFS=.-: 00:15:45.221 05:06:05 ublk -- scripts/common.sh@336 -- # read -ra ver1 00:15:45.221 05:06:05 ublk -- scripts/common.sh@337 -- # IFS=.-: 00:15:45.221 05:06:05 ublk -- scripts/common.sh@337 -- # read -ra ver2 00:15:45.221 05:06:05 ublk -- scripts/common.sh@338 -- # local 'op=<' 00:15:45.221 05:06:05 ublk -- scripts/common.sh@340 -- # ver1_l=2 00:15:45.221 05:06:05 ublk -- scripts/common.sh@341 -- # ver2_l=1 00:15:45.221 05:06:05 ublk -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:15:45.221 05:06:05 ublk -- scripts/common.sh@344 -- # case "$op" in 00:15:45.221 05:06:05 ublk -- scripts/common.sh@345 -- # : 1 00:15:45.221 05:06:05 ublk -- scripts/common.sh@364 -- # (( v = 0 )) 00:15:45.221 05:06:05 ublk -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:15:45.221 05:06:05 ublk -- scripts/common.sh@365 -- # decimal 1 00:15:45.221 05:06:05 ublk -- scripts/common.sh@353 -- # local d=1 00:15:45.221 05:06:05 ublk -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:15:45.221 05:06:05 ublk -- scripts/common.sh@355 -- # echo 1 00:15:45.221 05:06:05 ublk -- scripts/common.sh@365 -- # ver1[v]=1 00:15:45.221 05:06:05 ublk -- scripts/common.sh@366 -- # decimal 2 00:15:45.221 05:06:05 ublk -- scripts/common.sh@353 -- # local d=2 00:15:45.221 05:06:05 ublk -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:15:45.221 05:06:05 ublk -- scripts/common.sh@355 -- # echo 2 00:15:45.221 05:06:05 ublk -- scripts/common.sh@366 -- # ver2[v]=2 00:15:45.221 05:06:05 ublk -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:15:45.221 05:06:05 ublk -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:15:45.221 05:06:05 ublk -- scripts/common.sh@368 -- # return 0 00:15:45.221 05:06:05 ublk -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:15:45.221 05:06:05 ublk -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:15:45.221 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:45.221 --rc genhtml_branch_coverage=1 00:15:45.221 --rc genhtml_function_coverage=1 00:15:45.221 --rc genhtml_legend=1 00:15:45.221 --rc geninfo_all_blocks=1 00:15:45.221 --rc geninfo_unexecuted_blocks=1 00:15:45.221 00:15:45.221 ' 00:15:45.221 05:06:05 ublk -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:15:45.221 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:45.221 --rc genhtml_branch_coverage=1 00:15:45.221 --rc genhtml_function_coverage=1 00:15:45.221 --rc genhtml_legend=1 00:15:45.221 --rc geninfo_all_blocks=1 00:15:45.221 --rc geninfo_unexecuted_blocks=1 00:15:45.221 00:15:45.221 ' 00:15:45.221 05:06:05 ublk -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:15:45.221 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:45.221 --rc genhtml_branch_coverage=1 00:15:45.221 --rc genhtml_function_coverage=1 00:15:45.221 --rc genhtml_legend=1 00:15:45.221 --rc geninfo_all_blocks=1 00:15:45.221 --rc geninfo_unexecuted_blocks=1 00:15:45.221 00:15:45.221 ' 00:15:45.221 05:06:05 ublk -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:15:45.221 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:15:45.221 --rc genhtml_branch_coverage=1 00:15:45.221 --rc genhtml_function_coverage=1 00:15:45.221 --rc genhtml_legend=1 00:15:45.221 --rc geninfo_all_blocks=1 00:15:45.221 --rc geninfo_unexecuted_blocks=1 00:15:45.221 00:15:45.221 ' 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/lvol/common.sh 00:15:45.221 05:06:05 ublk -- lvol/common.sh@6 -- # MALLOC_SIZE_MB=128 00:15:45.221 05:06:05 ublk -- lvol/common.sh@7 -- # MALLOC_BS=512 00:15:45.221 05:06:05 ublk -- lvol/common.sh@8 -- # AIO_SIZE_MB=400 00:15:45.221 05:06:05 ublk -- lvol/common.sh@9 -- # AIO_BS=4096 00:15:45.221 05:06:05 ublk -- lvol/common.sh@10 -- # LVS_DEFAULT_CLUSTER_SIZE_MB=4 00:15:45.221 05:06:05 ublk -- lvol/common.sh@11 -- # LVS_DEFAULT_CLUSTER_SIZE=4194304 00:15:45.221 05:06:05 ublk -- lvol/common.sh@13 -- # LVS_DEFAULT_CAPACITY_MB=124 00:15:45.221 05:06:05 ublk -- lvol/common.sh@14 -- # LVS_DEFAULT_CAPACITY=130023424 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@11 -- # [[ -z '' ]] 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@12 -- # NUM_DEVS=4 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@13 -- # NUM_QUEUE=4 00:15:45.221 05:06:05 ublk 
-- ublk/ublk.sh@14 -- # QUEUE_DEPTH=512 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@15 -- # MALLOC_SIZE_MB=128 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@17 -- # STOP_DISKS=1 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@27 -- # MALLOC_BS=4096 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@28 -- # FILE_SIZE=134217728 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@29 -- # MAX_DEV_ID=3 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@133 -- # modprobe ublk_drv 00:15:45.221 05:06:05 ublk -- ublk/ublk.sh@136 -- # run_test test_save_ublk_config test_save_config 00:15:45.221 05:06:05 ublk -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:15:45.221 05:06:05 ublk -- common/autotest_common.sh@1111 -- # xtrace_disable 00:15:45.221 05:06:05 ublk -- common/autotest_common.sh@10 -- # set +x 00:15:45.221 ************************************ 00:15:45.221 START TEST test_save_ublk_config 00:15:45.221 ************************************ 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- common/autotest_common.sh@1129 -- # test_save_config 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- ublk/ublk.sh@100 -- # local tgtpid blkpath config 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- ublk/ublk.sh@102 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -L ublk 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- ublk/ublk.sh@103 -- # tgtpid=86695 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- ublk/ublk.sh@104 -- # trap 'killprocess $tgtpid' EXIT 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- ublk/ublk.sh@106 -- # waitforlisten 86695 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- common/autotest_common.sh@835 -- # '[' -z 86695 ']' 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- common/autotest_common.sh@840 -- # local max_retries=100 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:15:45.221 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- common/autotest_common.sh@844 -- # xtrace_disable 00:15:45.221 05:06:05 ublk.test_save_ublk_config -- common/autotest_common.sh@10 -- # set +x 00:15:45.221 [2024-12-15 05:06:05.272229] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
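Condensed, test_save_config drives a create-then-dump round trip through the target it just launched. A sketch using the stock rpc.py client, with names and sizes taken from the config dump that follows (8192 blocks x 4096 B = 32 MiB; cpumask "1" and queue settings per the dumped ublk subsystem):

./scripts/rpc.py ublk_create_target                     # records cpumask "1"
./scripts/rpc.py bdev_malloc_create -b malloc0 32 4096
./scripts/rpc.py ublk_start_disk malloc0 0 -q 1 -d 128  # -> /dev/ublkb0
./scripts/rpc.py save_config                            # emits the "subsystems" JSON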
00:15:45.221 [2024-12-15 05:06:05.272703] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86695 ] 00:15:45.482 [2024-12-15 05:06:05.427337] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:45.482 [2024-12-15 05:06:05.456235] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:15:46.054 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:15:46.054 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@868 -- # return 0 00:15:46.054 05:06:06 ublk.test_save_ublk_config -- ublk/ublk.sh@107 -- # blkpath=/dev/ublkb0 00:15:46.054 05:06:06 ublk.test_save_ublk_config -- ublk/ublk.sh@108 -- # rpc_cmd 00:15:46.054 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@563 -- # xtrace_disable 00:15:46.054 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@10 -- # set +x 00:15:46.054 [2024-12-15 05:06:06.135458] ublk.c: 572:ublk_ctrl_cmd_get_features: *NOTICE*: User Copy enabled 00:15:46.054 [2024-12-15 05:06:06.136370] ublk.c: 758:ublk_create_target: *NOTICE*: UBLK target created successfully 00:15:46.054 malloc0 00:15:46.054 [2024-12-15 05:06:06.167579] ublk.c:1924:ublk_start_disk: *DEBUG*: ublk0: bdev malloc0 num_queues 1 queue_depth 128 00:15:46.054 [2024-12-15 05:06:06.167678] ublk.c:1965:ublk_start_disk: *INFO*: Enabling kernel access to bdev malloc0 via ublk 0 00:15:46.054 [2024-12-15 05:06:06.167688] ublk.c: 971:ublk_dev_list_register: *DEBUG*: ublk0: add to tailq 00:15:46.054 [2024-12-15 05:06:06.167706] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_ADD_DEV 00:15:46.054 [2024-12-15 05:06:06.176973] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_ADD_DEV completed 00:15:46.054 [2024-12-15 05:06:06.177013] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_SET_PARAMS 00:15:46.054 [2024-12-15 05:06:06.184453] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_SET_PARAMS completed 00:15:46.054 [2024-12-15 05:06:06.184581] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_START_DEV 00:15:46.315 [2024-12-15 05:06:06.201474] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_START_DEV completed 00:15:46.315 0 00:15:46.315 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:15:46.315 05:06:06 ublk.test_save_ublk_config -- ublk/ublk.sh@115 -- # rpc_cmd save_config 00:15:46.315 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@563 -- # xtrace_disable 00:15:46.315 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@10 -- # set +x 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- ublk/ublk.sh@115 -- # config='{ 00:15:46.577 "subsystems": [ 00:15:46.577 { 00:15:46.577 "subsystem": "fsdev", 00:15:46.577 "config": [ 00:15:46.577 { 00:15:46.577 "method": "fsdev_set_opts", 00:15:46.577 "params": { 00:15:46.577 "fsdev_io_pool_size": 65535, 00:15:46.577 "fsdev_io_cache_size": 256 00:15:46.577 } 00:15:46.577 } 00:15:46.577 ] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "keyring", 00:15:46.577 "config": [] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "iobuf", 00:15:46.577 "config": [ 00:15:46.577 { 
00:15:46.577 "method": "iobuf_set_options", 00:15:46.577 "params": { 00:15:46.577 "small_pool_count": 8192, 00:15:46.577 "large_pool_count": 1024, 00:15:46.577 "small_bufsize": 8192, 00:15:46.577 "large_bufsize": 135168, 00:15:46.577 "enable_numa": false 00:15:46.577 } 00:15:46.577 } 00:15:46.577 ] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "sock", 00:15:46.577 "config": [ 00:15:46.577 { 00:15:46.577 "method": "sock_set_default_impl", 00:15:46.577 "params": { 00:15:46.577 "impl_name": "posix" 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "sock_impl_set_options", 00:15:46.577 "params": { 00:15:46.577 "impl_name": "ssl", 00:15:46.577 "recv_buf_size": 4096, 00:15:46.577 "send_buf_size": 4096, 00:15:46.577 "enable_recv_pipe": true, 00:15:46.577 "enable_quickack": false, 00:15:46.577 "enable_placement_id": 0, 00:15:46.577 "enable_zerocopy_send_server": true, 00:15:46.577 "enable_zerocopy_send_client": false, 00:15:46.577 "zerocopy_threshold": 0, 00:15:46.577 "tls_version": 0, 00:15:46.577 "enable_ktls": false 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "sock_impl_set_options", 00:15:46.577 "params": { 00:15:46.577 "impl_name": "posix", 00:15:46.577 "recv_buf_size": 2097152, 00:15:46.577 "send_buf_size": 2097152, 00:15:46.577 "enable_recv_pipe": true, 00:15:46.577 "enable_quickack": false, 00:15:46.577 "enable_placement_id": 0, 00:15:46.577 "enable_zerocopy_send_server": true, 00:15:46.577 "enable_zerocopy_send_client": false, 00:15:46.577 "zerocopy_threshold": 0, 00:15:46.577 "tls_version": 0, 00:15:46.577 "enable_ktls": false 00:15:46.577 } 00:15:46.577 } 00:15:46.577 ] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "vmd", 00:15:46.577 "config": [] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "accel", 00:15:46.577 "config": [ 00:15:46.577 { 00:15:46.577 "method": "accel_set_options", 00:15:46.577 "params": { 00:15:46.577 "small_cache_size": 128, 00:15:46.577 "large_cache_size": 16, 00:15:46.577 "task_count": 2048, 00:15:46.577 "sequence_count": 2048, 00:15:46.577 "buf_count": 2048 00:15:46.577 } 00:15:46.577 } 00:15:46.577 ] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "bdev", 00:15:46.577 "config": [ 00:15:46.577 { 00:15:46.577 "method": "bdev_set_options", 00:15:46.577 "params": { 00:15:46.577 "bdev_io_pool_size": 65535, 00:15:46.577 "bdev_io_cache_size": 256, 00:15:46.577 "bdev_auto_examine": true, 00:15:46.577 "iobuf_small_cache_size": 128, 00:15:46.577 "iobuf_large_cache_size": 16 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "bdev_raid_set_options", 00:15:46.577 "params": { 00:15:46.577 "process_window_size_kb": 1024, 00:15:46.577 "process_max_bandwidth_mb_sec": 0 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "bdev_iscsi_set_options", 00:15:46.577 "params": { 00:15:46.577 "timeout_sec": 30 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "bdev_nvme_set_options", 00:15:46.577 "params": { 00:15:46.577 "action_on_timeout": "none", 00:15:46.577 "timeout_us": 0, 00:15:46.577 "timeout_admin_us": 0, 00:15:46.577 "keep_alive_timeout_ms": 10000, 00:15:46.577 "arbitration_burst": 0, 00:15:46.577 "low_priority_weight": 0, 00:15:46.577 "medium_priority_weight": 0, 00:15:46.577 "high_priority_weight": 0, 00:15:46.577 "nvme_adminq_poll_period_us": 10000, 00:15:46.577 "nvme_ioq_poll_period_us": 0, 00:15:46.577 "io_queue_requests": 0, 00:15:46.577 "delay_cmd_submit": true, 00:15:46.577 "transport_retry_count": 4, 00:15:46.577 
"bdev_retry_count": 3, 00:15:46.577 "transport_ack_timeout": 0, 00:15:46.577 "ctrlr_loss_timeout_sec": 0, 00:15:46.577 "reconnect_delay_sec": 0, 00:15:46.577 "fast_io_fail_timeout_sec": 0, 00:15:46.577 "disable_auto_failback": false, 00:15:46.577 "generate_uuids": false, 00:15:46.577 "transport_tos": 0, 00:15:46.577 "nvme_error_stat": false, 00:15:46.577 "rdma_srq_size": 0, 00:15:46.577 "io_path_stat": false, 00:15:46.577 "allow_accel_sequence": false, 00:15:46.577 "rdma_max_cq_size": 0, 00:15:46.577 "rdma_cm_event_timeout_ms": 0, 00:15:46.577 "dhchap_digests": [ 00:15:46.577 "sha256", 00:15:46.577 "sha384", 00:15:46.577 "sha512" 00:15:46.577 ], 00:15:46.577 "dhchap_dhgroups": [ 00:15:46.577 "null", 00:15:46.577 "ffdhe2048", 00:15:46.577 "ffdhe3072", 00:15:46.577 "ffdhe4096", 00:15:46.577 "ffdhe6144", 00:15:46.577 "ffdhe8192" 00:15:46.577 ], 00:15:46.577 "rdma_umr_per_io": false 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "bdev_nvme_set_hotplug", 00:15:46.577 "params": { 00:15:46.577 "period_us": 100000, 00:15:46.577 "enable": false 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "bdev_malloc_create", 00:15:46.577 "params": { 00:15:46.577 "name": "malloc0", 00:15:46.577 "num_blocks": 8192, 00:15:46.577 "block_size": 4096, 00:15:46.577 "physical_block_size": 4096, 00:15:46.577 "uuid": "a1024956-cd66-4008-87e3-10bce845844e", 00:15:46.577 "optimal_io_boundary": 0, 00:15:46.577 "md_size": 0, 00:15:46.577 "dif_type": 0, 00:15:46.577 "dif_is_head_of_md": false, 00:15:46.577 "dif_pi_format": 0 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "bdev_wait_for_examine" 00:15:46.577 } 00:15:46.577 ] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "scsi", 00:15:46.577 "config": null 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "scheduler", 00:15:46.577 "config": [ 00:15:46.577 { 00:15:46.577 "method": "framework_set_scheduler", 00:15:46.577 "params": { 00:15:46.577 "name": "static" 00:15:46.577 } 00:15:46.577 } 00:15:46.577 ] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "vhost_scsi", 00:15:46.577 "config": [] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "vhost_blk", 00:15:46.577 "config": [] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "ublk", 00:15:46.577 "config": [ 00:15:46.577 { 00:15:46.577 "method": "ublk_create_target", 00:15:46.577 "params": { 00:15:46.577 "cpumask": "1" 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "ublk_start_disk", 00:15:46.577 "params": { 00:15:46.577 "bdev_name": "malloc0", 00:15:46.577 "ublk_id": 0, 00:15:46.577 "num_queues": 1, 00:15:46.577 "queue_depth": 128 00:15:46.577 } 00:15:46.577 } 00:15:46.577 ] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "nbd", 00:15:46.577 "config": [] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "nvmf", 00:15:46.577 "config": [ 00:15:46.577 { 00:15:46.577 "method": "nvmf_set_config", 00:15:46.577 "params": { 00:15:46.577 "discovery_filter": "match_any", 00:15:46.577 "admin_cmd_passthru": { 00:15:46.577 "identify_ctrlr": false 00:15:46.577 }, 00:15:46.577 "dhchap_digests": [ 00:15:46.577 "sha256", 00:15:46.577 "sha384", 00:15:46.577 "sha512" 00:15:46.577 ], 00:15:46.577 "dhchap_dhgroups": [ 00:15:46.577 "null", 00:15:46.577 "ffdhe2048", 00:15:46.577 "ffdhe3072", 00:15:46.577 "ffdhe4096", 00:15:46.577 "ffdhe6144", 00:15:46.577 "ffdhe8192" 00:15:46.577 ] 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "nvmf_set_max_subsystems", 00:15:46.577 "params": { 
00:15:46.577 "max_subsystems": 1024 00:15:46.577 } 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "method": "nvmf_set_crdt", 00:15:46.577 "params": { 00:15:46.577 "crdt1": 0, 00:15:46.577 "crdt2": 0, 00:15:46.577 "crdt3": 0 00:15:46.577 } 00:15:46.577 } 00:15:46.577 ] 00:15:46.577 }, 00:15:46.577 { 00:15:46.577 "subsystem": "iscsi", 00:15:46.577 "config": [ 00:15:46.577 { 00:15:46.577 "method": "iscsi_set_options", 00:15:46.577 "params": { 00:15:46.577 "node_base": "iqn.2016-06.io.spdk", 00:15:46.577 "max_sessions": 128, 00:15:46.577 "max_connections_per_session": 2, 00:15:46.577 "max_queue_depth": 64, 00:15:46.577 "default_time2wait": 2, 00:15:46.577 "default_time2retain": 20, 00:15:46.577 "first_burst_length": 8192, 00:15:46.577 "immediate_data": true, 00:15:46.577 "allow_duplicated_isid": false, 00:15:46.577 "error_recovery_level": 0, 00:15:46.577 "nop_timeout": 60, 00:15:46.577 "nop_in_interval": 30, 00:15:46.577 "disable_chap": false, 00:15:46.577 "require_chap": false, 00:15:46.577 "mutual_chap": false, 00:15:46.577 "chap_group": 0, 00:15:46.577 "max_large_datain_per_connection": 64, 00:15:46.577 "max_r2t_per_connection": 4, 00:15:46.577 "pdu_pool_size": 36864, 00:15:46.577 "immediate_data_pool_size": 16384, 00:15:46.577 "data_out_pool_size": 2048 00:15:46.577 } 00:15:46.577 } 00:15:46.577 ] 00:15:46.577 } 00:15:46.577 ] 00:15:46.577 }' 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- ublk/ublk.sh@116 -- # killprocess 86695 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@954 -- # '[' -z 86695 ']' 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@958 -- # kill -0 86695 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@959 -- # uname 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 86695 00:15:46.577 killing process with pid 86695 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@972 -- # echo 'killing process with pid 86695' 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@973 -- # kill 86695 00:15:46.577 05:06:06 ublk.test_save_ublk_config -- common/autotest_common.sh@978 -- # wait 86695 00:15:46.838 [2024-12-15 05:06:06.715809] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_STOP_DEV 00:15:46.838 [2024-12-15 05:06:06.745539] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_STOP_DEV completed 00:15:46.838 [2024-12-15 05:06:06.745685] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_DEL_DEV 00:15:46.838 [2024-12-15 05:06:06.752471] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_DEL_DEV completed 00:15:46.838 [2024-12-15 05:06:06.752529] ublk.c: 985:ublk_dev_list_unregister: *DEBUG*: ublk0: remove from tailq 00:15:46.838 [2024-12-15 05:06:06.752538] ublk.c:1819:ublk_free_dev: *NOTICE*: ublk dev 0 stopped 00:15:46.838 [2024-12-15 05:06:06.752570] ublk.c: 835:_ublk_fini: *DEBUG*: finish shutdown 00:15:46.838 [2024-12-15 05:06:06.752713] ublk.c: 766:_ublk_fini_done: *DEBUG*: 00:15:47.098 05:06:07 ublk.test_save_ublk_config -- ublk/ublk.sh@119 -- # tgtpid=86732 00:15:47.098 05:06:07 
ublk.test_save_ublk_config -- ublk/ublk.sh@121 -- # waitforlisten 86732 00:15:47.099 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:15:47.099 05:06:07 ublk.test_save_ublk_config -- common/autotest_common.sh@835 -- # '[' -z 86732 ']' 00:15:47.099 05:06:07 ublk.test_save_ublk_config -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:15:47.099 05:06:07 ublk.test_save_ublk_config -- common/autotest_common.sh@840 -- # local max_retries=100 00:15:47.099 05:06:07 ublk.test_save_ublk_config -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:15:47.099 05:06:07 ublk.test_save_ublk_config -- common/autotest_common.sh@844 -- # xtrace_disable 00:15:47.099 05:06:07 ublk.test_save_ublk_config -- common/autotest_common.sh@10 -- # set +x 00:15:47.099 05:06:07 ublk.test_save_ublk_config -- ublk/ublk.sh@118 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -L ublk -c /dev/fd/63 00:15:47.099 05:06:07 ublk.test_save_ublk_config -- ublk/ublk.sh@118 -- # echo '{ 00:15:47.099 "subsystems": [ 00:15:47.099 { 00:15:47.099 "subsystem": "fsdev", 00:15:47.099 "config": [ 00:15:47.099 { 00:15:47.099 "method": "fsdev_set_opts", 00:15:47.099 "params": { 00:15:47.099 "fsdev_io_pool_size": 65535, 00:15:47.099 "fsdev_io_cache_size": 256 00:15:47.099 } 00:15:47.099 } 00:15:47.099 ] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "keyring", 00:15:47.099 "config": [] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "iobuf", 00:15:47.099 "config": [ 00:15:47.099 { 00:15:47.099 "method": "iobuf_set_options", 00:15:47.099 "params": { 00:15:47.099 "small_pool_count": 8192, 00:15:47.099 "large_pool_count": 1024, 00:15:47.099 "small_bufsize": 8192, 00:15:47.099 "large_bufsize": 135168, 00:15:47.099 "enable_numa": false 00:15:47.099 } 00:15:47.099 } 00:15:47.099 ] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "sock", 00:15:47.099 "config": [ 00:15:47.099 { 00:15:47.099 "method": "sock_set_default_impl", 00:15:47.099 "params": { 00:15:47.099 "impl_name": "posix" 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "sock_impl_set_options", 00:15:47.099 "params": { 00:15:47.099 "impl_name": "ssl", 00:15:47.099 "recv_buf_size": 4096, 00:15:47.099 "send_buf_size": 4096, 00:15:47.099 "enable_recv_pipe": true, 00:15:47.099 "enable_quickack": false, 00:15:47.099 "enable_placement_id": 0, 00:15:47.099 "enable_zerocopy_send_server": true, 00:15:47.099 "enable_zerocopy_send_client": false, 00:15:47.099 "zerocopy_threshold": 0, 00:15:47.099 "tls_version": 0, 00:15:47.099 "enable_ktls": false 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "sock_impl_set_options", 00:15:47.099 "params": { 00:15:47.099 "impl_name": "posix", 00:15:47.099 "recv_buf_size": 2097152, 00:15:47.099 "send_buf_size": 2097152, 00:15:47.099 "enable_recv_pipe": true, 00:15:47.099 "enable_quickack": false, 00:15:47.099 "enable_placement_id": 0, 00:15:47.099 "enable_zerocopy_send_server": true, 00:15:47.099 "enable_zerocopy_send_client": false, 00:15:47.099 "zerocopy_threshold": 0, 00:15:47.099 "tls_version": 0, 00:15:47.099 "enable_ktls": false 00:15:47.099 } 00:15:47.099 } 00:15:47.099 ] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "vmd", 00:15:47.099 "config": [] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "accel", 00:15:47.099 "config": [ 00:15:47.099 { 00:15:47.099 "method": "accel_set_options", 00:15:47.099 "params": { 00:15:47.099 
"small_cache_size": 128, 00:15:47.099 "large_cache_size": 16, 00:15:47.099 "task_count": 2048, 00:15:47.099 "sequence_count": 2048, 00:15:47.099 "buf_count": 2048 00:15:47.099 } 00:15:47.099 } 00:15:47.099 ] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "bdev", 00:15:47.099 "config": [ 00:15:47.099 { 00:15:47.099 "method": "bdev_set_options", 00:15:47.099 "params": { 00:15:47.099 "bdev_io_pool_size": 65535, 00:15:47.099 "bdev_io_cache_size": 256, 00:15:47.099 "bdev_auto_examine": true, 00:15:47.099 "iobuf_small_cache_size": 128, 00:15:47.099 "iobuf_large_cache_size": 16 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "bdev_raid_set_options", 00:15:47.099 "params": { 00:15:47.099 "process_window_size_kb": 1024, 00:15:47.099 "process_max_bandwidth_mb_sec": 0 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "bdev_iscsi_set_options", 00:15:47.099 "params": { 00:15:47.099 "timeout_sec": 30 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "bdev_nvme_set_options", 00:15:47.099 "params": { 00:15:47.099 "action_on_timeout": "none", 00:15:47.099 "timeout_us": 0, 00:15:47.099 "timeout_admin_us": 0, 00:15:47.099 "keep_alive_timeout_ms": 10000, 00:15:47.099 "arbitration_burst": 0, 00:15:47.099 "low_priority_weight": 0, 00:15:47.099 "medium_priority_weight": 0, 00:15:47.099 "high_priority_weight": 0, 00:15:47.099 "nvme_adminq_poll_period_us": 10000, 00:15:47.099 "nvme_ioq_poll_period_us": 0, 00:15:47.099 "io_queue_requests": 0, 00:15:47.099 "delay_cmd_submit": true, 00:15:47.099 "transport_retry_count": 4, 00:15:47.099 "bdev_retry_count": 3, 00:15:47.099 "transport_ack_timeout": 0, 00:15:47.099 "ctrlr_loss_timeout_sec": 0, 00:15:47.099 "reconnect_delay_sec": 0, 00:15:47.099 "fast_io_fail_timeout_sec": 0, 00:15:47.099 "disable_auto_failback": false, 00:15:47.099 "generate_uuids": false, 00:15:47.099 "transport_tos": 0, 00:15:47.099 "nvme_error_stat": false, 00:15:47.099 "rdma_srq_size": 0, 00:15:47.099 "io_path_stat": false, 00:15:47.099 "allow_accel_sequence": false, 00:15:47.099 "rdma_max_cq_size": 0, 00:15:47.099 "rdma_cm_event_timeout_ms": 0, 00:15:47.099 "dhchap_digests": [ 00:15:47.099 "sha256", 00:15:47.099 "sha384", 00:15:47.099 "sha512" 00:15:47.099 ], 00:15:47.099 "dhchap_dhgroups": [ 00:15:47.099 "null", 00:15:47.099 "ffdhe2048", 00:15:47.099 "ffdhe3072", 00:15:47.099 "ffdhe4096", 00:15:47.099 "ffdhe6144", 00:15:47.099 "ffdhe8192" 00:15:47.099 ], 00:15:47.099 "rdma_umr_per_io": false 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "bdev_nvme_set_hotplug", 00:15:47.099 "params": { 00:15:47.099 "period_us": 100000, 00:15:47.099 "enable": false 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "bdev_malloc_create", 00:15:47.099 "params": { 00:15:47.099 "name": "malloc0", 00:15:47.099 "num_blocks": 8192, 00:15:47.099 "block_size": 4096, 00:15:47.099 "physical_block_size": 4096, 00:15:47.099 "uuid": "a1024956-cd66-4008-87e3-10bce845844e", 00:15:47.099 "optimal_io_boundary": 0, 00:15:47.099 "md_size": 0, 00:15:47.099 "dif_type": 0, 00:15:47.099 "dif_is_head_of_md": false, 00:15:47.099 "dif_pi_format": 0 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "bdev_wait_for_examine" 00:15:47.099 } 00:15:47.099 ] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "scsi", 00:15:47.099 "config": null 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "scheduler", 00:15:47.099 "config": [ 00:15:47.099 { 00:15:47.099 "method": "framework_set_scheduler", 00:15:47.099 
"params": { 00:15:47.099 "name": "static" 00:15:47.099 } 00:15:47.099 } 00:15:47.099 ] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "vhost_scsi", 00:15:47.099 "config": [] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "vhost_blk", 00:15:47.099 "config": [] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "ublk", 00:15:47.099 "config": [ 00:15:47.099 { 00:15:47.099 "method": "ublk_create_target", 00:15:47.099 "params": { 00:15:47.099 "cpumask": "1" 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "ublk_start_disk", 00:15:47.099 "params": { 00:15:47.099 "bdev_name": "malloc0", 00:15:47.099 "ublk_id": 0, 00:15:47.099 "num_queues": 1, 00:15:47.099 "queue_depth": 128 00:15:47.099 } 00:15:47.099 } 00:15:47.099 ] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "nbd", 00:15:47.099 "config": [] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "nvmf", 00:15:47.099 "config": [ 00:15:47.099 { 00:15:47.099 "method": "nvmf_set_config", 00:15:47.099 "params": { 00:15:47.099 "discovery_filter": "match_any", 00:15:47.099 "admin_cmd_passthru": { 00:15:47.099 "identify_ctrlr": false 00:15:47.099 }, 00:15:47.099 "dhchap_digests": [ 00:15:47.099 "sha256", 00:15:47.099 "sha384", 00:15:47.099 "sha512" 00:15:47.099 ], 00:15:47.099 "dhchap_dhgroups": [ 00:15:47.099 "null", 00:15:47.099 "ffdhe2048", 00:15:47.099 "ffdhe3072", 00:15:47.099 "ffdhe4096", 00:15:47.099 "ffdhe6144", 00:15:47.099 "ffdhe8192" 00:15:47.099 ] 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "nvmf_set_max_subsystems", 00:15:47.099 "params": { 00:15:47.099 "max_subsystems": 1024 00:15:47.099 } 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "method": "nvmf_set_crdt", 00:15:47.099 "params": { 00:15:47.099 "crdt1": 0, 00:15:47.099 "crdt2": 0, 00:15:47.099 "crdt3": 0 00:15:47.099 } 00:15:47.099 } 00:15:47.099 ] 00:15:47.099 }, 00:15:47.099 { 00:15:47.099 "subsystem": "iscsi", 00:15:47.099 "config": [ 00:15:47.099 { 00:15:47.100 "method": "iscsi_set_options", 00:15:47.100 "params": { 00:15:47.100 "node_base": "iqn.2016-06.io.spdk", 00:15:47.100 "max_sessions": 128, 00:15:47.100 "max_connections_per_session": 2, 00:15:47.100 "max_queue_depth": 64, 00:15:47.100 "default_time2wait": 2, 00:15:47.100 "default_time2retain": 20, 00:15:47.100 "first_burst_length": 8192, 00:15:47.100 "immediate_data": true, 00:15:47.100 "allow_duplicated_isid": false, 00:15:47.100 "error_recovery_level": 0, 00:15:47.100 "nop_timeout": 60, 00:15:47.100 "nop_in_interval": 30, 00:15:47.100 "disable_chap": false, 00:15:47.100 "require_chap": false, 00:15:47.100 "mutual_chap": false, 00:15:47.100 "chap_group": 0, 00:15:47.100 "max_large_datain_per_connection": 64, 00:15:47.100 "max_r2t_per_connection": 4, 00:15:47.100 "pdu_pool_size": 36864, 00:15:47.100 "immediate_data_pool_size": 16384, 00:15:47.100 "data_out_pool_size": 2048 00:15:47.100 } 00:15:47.100 } 00:15:47.100 ] 00:15:47.100 } 00:15:47.100 ] 00:15:47.100 }' 00:15:47.100 [2024-12-15 05:06:07.178756] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:15:47.100 [2024-12-15 05:06:07.179087] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86732 ] 00:15:47.361 [2024-12-15 05:06:07.339820] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:47.361 [2024-12-15 05:06:07.359974] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:15:47.622 [2024-12-15 05:06:07.723454] ublk.c: 572:ublk_ctrl_cmd_get_features: *NOTICE*: User Copy enabled 00:15:47.622 [2024-12-15 05:06:07.723835] ublk.c: 758:ublk_create_target: *NOTICE*: UBLK target created successfully 00:15:47.622 [2024-12-15 05:06:07.731601] ublk.c:1924:ublk_start_disk: *DEBUG*: ublk0: bdev malloc0 num_queues 1 queue_depth 128 00:15:47.622 [2024-12-15 05:06:07.731701] ublk.c:1965:ublk_start_disk: *INFO*: Enabling kernel access to bdev malloc0 via ublk 0 00:15:47.622 [2024-12-15 05:06:07.731710] ublk.c: 971:ublk_dev_list_register: *DEBUG*: ublk0: add to tailq 00:15:47.622 [2024-12-15 05:06:07.731720] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_ADD_DEV 00:15:47.622 [2024-12-15 05:06:07.740554] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_ADD_DEV completed 00:15:47.622 [2024-12-15 05:06:07.740589] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_SET_PARAMS 00:15:47.622 [2024-12-15 05:06:07.747473] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_SET_PARAMS completed 00:15:47.622 [2024-12-15 05:06:07.747594] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_START_DEV 00:15:47.882 [2024-12-15 05:06:07.764466] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_START_DEV completed 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@868 -- # return 0 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- ublk/ublk.sh@122 -- # rpc_cmd ublk_get_disks 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@563 -- # xtrace_disable 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@10 -- # set +x 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- ublk/ublk.sh@122 -- # jq -r '.[0].ublk_device' 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- ublk/ublk.sh@122 -- # [[ /dev/ublkb0 == \/\d\e\v\/\u\b\l\k\b\0 ]] 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- ublk/ublk.sh@123 -- # [[ -b /dev/ublkb0 ]] 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- ublk/ublk.sh@125 -- # killprocess 86732 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@954 -- # '[' -z 86732 ']' 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@958 -- # kill -0 86732 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@959 -- # uname 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 86732 00:15:48.143 killing process with pid 86732 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:15:48.143 
05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@972 -- # echo 'killing process with pid 86732' 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@973 -- # kill 86732 00:15:48.143 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@978 -- # wait 86732 00:15:48.404 [2024-12-15 05:06:08.297820] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_STOP_DEV 00:15:48.404 [2024-12-15 05:06:08.344554] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_STOP_DEV completed 00:15:48.404 [2024-12-15 05:06:08.344679] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_DEL_DEV 00:15:48.404 [2024-12-15 05:06:08.352469] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_DEL_DEV completed 00:15:48.404 [2024-12-15 05:06:08.352521] ublk.c: 985:ublk_dev_list_unregister: *DEBUG*: ublk0: remove from tailq 00:15:48.404 [2024-12-15 05:06:08.352534] ublk.c:1819:ublk_free_dev: *NOTICE*: ublk dev 0 stopped 00:15:48.404 [2024-12-15 05:06:08.352558] ublk.c: 835:_ublk_fini: *DEBUG*: finish shutdown 00:15:48.404 [2024-12-15 05:06:08.352698] ublk.c: 766:_ublk_fini_done: *DEBUG*: 00:15:48.664 05:06:08 ublk.test_save_ublk_config -- ublk/ublk.sh@126 -- # trap - EXIT 00:15:48.664 00:15:48.664 real 0m3.517s 00:15:48.664 user 0m2.489s 00:15:48.664 sys 0m1.700s 00:15:48.664 ************************************ 00:15:48.664 END TEST test_save_ublk_config 00:15:48.664 ************************************ 00:15:48.664 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@1130 -- # xtrace_disable 00:15:48.664 05:06:08 ublk.test_save_ublk_config -- common/autotest_common.sh@10 -- # set +x 00:15:48.664 05:06:08 ublk -- ublk/ublk.sh@139 -- # spdk_pid=86780 00:15:48.664 05:06:08 ublk -- ublk/ublk.sh@140 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:15:48.664 05:06:08 ublk -- ublk/ublk.sh@141 -- # waitforlisten 86780 00:15:48.664 05:06:08 ublk -- common/autotest_common.sh@835 -- # '[' -z 86780 ']' 00:15:48.664 05:06:08 ublk -- ublk/ublk.sh@138 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x3 -L ublk 00:15:48.664 05:06:08 ublk -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:15:48.664 05:06:08 ublk -- common/autotest_common.sh@840 -- # local max_retries=100 00:15:48.664 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:15:48.664 05:06:08 ublk -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:15:48.664 05:06:08 ublk -- common/autotest_common.sh@844 -- # xtrace_disable 00:15:48.664 05:06:08 ublk -- common/autotest_common.sh@10 -- # set +x 00:15:48.925 [2024-12-15 05:06:08.818492] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
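One detail worth decoding in these launches is the core mask. A quick shell check of what -m 0x3 selects:

mask=0x3
for i in 0 1 2 3; do (( mask >> i & 1 )) && echo "reactor on core $i"; done
# -> cores 0 and 1, matching "Total cores available: 2" and the two reactor
#    notices that follow; the earlier runs used -c 0x1, i.e. core 0 only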
00:15:48.925 [2024-12-15 05:06:08.818782] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid86780 ] 00:15:48.925 [2024-12-15 05:06:08.976262] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:15:48.925 [2024-12-15 05:06:09.022469] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:15:48.925 [2024-12-15 05:06:09.022523] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:15:49.495 05:06:09 ublk -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:15:49.495 05:06:09 ublk -- common/autotest_common.sh@868 -- # return 0 00:15:49.495 05:06:09 ublk -- ublk/ublk.sh@143 -- # run_test test_create_ublk test_create_ublk 00:15:49.495 05:06:09 ublk -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:15:49.495 05:06:09 ublk -- common/autotest_common.sh@1111 -- # xtrace_disable 00:15:49.495 05:06:09 ublk -- common/autotest_common.sh@10 -- # set +x 00:15:49.755 ************************************ 00:15:49.755 START TEST test_create_ublk 00:15:49.755 ************************************ 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@1129 -- # test_create_ublk 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@33 -- # rpc_cmd ublk_create_target 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:15:49.755 [2024-12-15 05:06:09.649453] ublk.c: 572:ublk_ctrl_cmd_get_features: *NOTICE*: User Copy enabled 00:15:49.755 [2024-12-15 05:06:09.650558] ublk.c: 758:ublk_create_target: *NOTICE*: UBLK target created successfully 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@33 -- # ublk_target= 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@35 -- # rpc_cmd bdev_malloc_create 128 4096 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@35 -- # malloc_name=Malloc0 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@37 -- # rpc_cmd ublk_start_disk Malloc0 0 -q 4 -d 512 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:15:49.755 [2024-12-15 05:06:09.710587] ublk.c:1924:ublk_start_disk: *DEBUG*: ublk0: bdev Malloc0 num_queues 4 queue_depth 512 00:15:49.755 [2024-12-15 05:06:09.710976] ublk.c:1965:ublk_start_disk: *INFO*: Enabling kernel access to bdev Malloc0 via ublk 0 00:15:49.755 [2024-12-15 05:06:09.710990] ublk.c: 971:ublk_dev_list_register: *DEBUG*: ublk0: add to tailq 00:15:49.755 [2024-12-15 05:06:09.710999] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_ADD_DEV 00:15:49.755 [2024-12-15 05:06:09.718473] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_ADD_DEV completed 00:15:49.755 [2024-12-15 05:06:09.718500] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_SET_PARAMS 00:15:49.755 
[2024-12-15 05:06:09.726463] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_SET_PARAMS completed 00:15:49.755 [2024-12-15 05:06:09.727087] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_START_DEV 00:15:49.755 [2024-12-15 05:06:09.757469] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_START_DEV completed 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@37 -- # ublk_id=0 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@38 -- # ublk_path=/dev/ublkb0 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@39 -- # rpc_cmd ublk_get_disks -n 0 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:15:49.755 05:06:09 ublk.test_create_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@39 -- # ublk_dev='[ 00:15:49.755 { 00:15:49.755 "ublk_device": "/dev/ublkb0", 00:15:49.755 "id": 0, 00:15:49.755 "queue_depth": 512, 00:15:49.755 "num_queues": 4, 00:15:49.755 "bdev_name": "Malloc0" 00:15:49.755 } 00:15:49.755 ]' 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@41 -- # jq -r '.[0].ublk_device' 00:15:49.755 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@41 -- # [[ /dev/ublkb0 = \/\d\e\v\/\u\b\l\k\b\0 ]] 00:15:49.756 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@42 -- # jq -r '.[0].id' 00:15:49.756 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@42 -- # [[ 0 = \0 ]] 00:15:49.756 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@43 -- # jq -r '.[0].queue_depth' 00:15:49.756 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@43 -- # [[ 512 = \5\1\2 ]] 00:15:49.756 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@44 -- # jq -r '.[0].num_queues' 00:15:50.017 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@44 -- # [[ 4 = \4 ]] 00:15:50.017 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@45 -- # jq -r '.[0].bdev_name' 00:15:50.017 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@45 -- # [[ Malloc0 = \M\a\l\l\o\c\0 ]] 00:15:50.017 05:06:09 ublk.test_create_ublk -- ublk/ublk.sh@48 -- # run_fio_test /dev/ublkb0 0 134217728 write 0xcc '--time_based --runtime=10' 00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@40 -- # local file=/dev/ublkb0 00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@41 -- # local offset=0 00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@42 -- # local size=134217728 00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@43 -- # local rw=write 00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@44 -- # local pattern=0xcc 00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@45 -- # local 'extra_params=--time_based --runtime=10' 00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@47 -- # local pattern_template= fio_template= 00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@48 -- # [[ -n 0xcc ]] 00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@49 -- # pattern_template='--do_verify=1 --verify=pattern --verify_pattern=0xcc --verify_state_save=0' 00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@52 -- # fio_template='fio --name=fio_test --filename=/dev/ublkb0 --offset=0 --size=134217728 --rw=write --direct=1 --time_based --runtime=10 --do_verify=1 --verify=pattern --verify_pattern=0xcc --verify_state_save=0' 
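Before that template runs, the verify flags are worth unpacking:

# --rw=write --verify=pattern --verify_pattern=0xcc --do_verify=1
#     write 0xcc to every block, then read it back and compare
# --time_based --runtime=10
#     10 s of wall-clock writes; the write phase uses the whole budget, so
#     fio will note that the verification read phase never starts
# --size=134217728
#     128 MiB, the full FILE_SIZE of the malloc-backed ublk device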
00:15:50.017 05:06:09 ublk.test_create_ublk -- lvol/common.sh@53 -- # fio --name=fio_test --filename=/dev/ublkb0 --offset=0 --size=134217728 --rw=write --direct=1 --time_based --runtime=10 --do_verify=1 --verify=pattern --verify_pattern=0xcc --verify_state_save=0 00:15:50.017 fio: verification read phase will never start because write phase uses all of runtime 00:15:50.017 fio_test: (g=0): rw=write, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=psync, iodepth=1 00:15:50.017 fio-3.35 00:15:50.017 Starting 1 process 00:16:02.244 00:16:02.244 fio_test: (groupid=0, jobs=1): err= 0: pid=86825: Sun Dec 15 05:06:20 2024 00:16:02.244 write: IOPS=13.3k, BW=52.1MiB/s (54.6MB/s)(521MiB/10001msec); 0 zone resets 00:16:02.244 clat (usec): min=35, max=11638, avg=74.22, stdev=158.21 00:16:02.244 lat (usec): min=35, max=11651, avg=74.64, stdev=158.25 00:16:02.244 clat percentiles (usec): 00:16:02.244 | 1.00th=[ 45], 5.00th=[ 49], 10.00th=[ 52], 20.00th=[ 57], 00:16:02.244 | 30.00th=[ 60], 40.00th=[ 63], 50.00th=[ 67], 60.00th=[ 69], 00:16:02.244 | 70.00th=[ 71], 80.00th=[ 75], 90.00th=[ 80], 95.00th=[ 85], 00:16:02.244 | 99.00th=[ 104], 99.50th=[ 231], 99.90th=[ 3359], 99.95th=[ 3687], 00:16:02.245 | 99.99th=[ 4178] 00:16:02.245 bw ( KiB/s): min=17344, max=74152, per=98.57%, avg=52589.05, stdev=12074.79, samples=19 00:16:02.245 iops : min= 4336, max=18538, avg=13147.26, stdev=3018.70, samples=19 00:16:02.245 lat (usec) : 50=6.99%, 100=91.87%, 250=0.72%, 500=0.11%, 750=0.02% 00:16:02.245 lat (usec) : 1000=0.02% 00:16:02.245 lat (msec) : 2=0.06%, 4=0.19%, 10=0.02%, 20=0.01% 00:16:02.245 cpu : usr=1.61%, sys=11.82%, ctx=133417, majf=0, minf=797 00:16:02.245 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:16:02.245 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:16:02.245 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:16:02.245 issued rwts: total=0,133396,0,0 short=0,0,0,0 dropped=0,0,0,0 00:16:02.245 latency : target=0, window=0, percentile=100.00%, depth=1 00:16:02.245 00:16:02.245 Run status group 0 (all jobs): 00:16:02.245 WRITE: bw=52.1MiB/s (54.6MB/s), 52.1MiB/s-52.1MiB/s (54.6MB/s-54.6MB/s), io=521MiB (546MB), run=10001-10001msec 00:16:02.245 00:16:02.245 Disk stats (read/write): 00:16:02.245 ublkb0: ios=0/131608, merge=0/0, ticks=0/8406, in_queue=8407, util=98.84% 00:16:02.245 05:06:20 ublk.test_create_ublk -- ublk/ublk.sh@51 -- # rpc_cmd ublk_stop_disk 0 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 [2024-12-15 05:06:20.165467] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_STOP_DEV 00:16:02.245 [2024-12-15 05:06:20.206033] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_STOP_DEV completed 00:16:02.245 [2024-12-15 05:06:20.206961] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_DEL_DEV 00:16:02.245 [2024-12-15 05:06:20.212472] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_DEL_DEV completed 00:16:02.245 [2024-12-15 05:06:20.212731] ublk.c: 985:ublk_dev_list_unregister: *DEBUG*: ublk0: remove from tailq 00:16:02.245 [2024-12-15 05:06:20.212739] ublk.c:1819:ublk_free_dev: *NOTICE*: ublk dev 0 stopped 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.245 05:06:20 ublk.test_create_ublk -- ublk/ublk.sh@53 -- # NOT 
rpc_cmd ublk_stop_disk 0 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@652 -- # local es=0 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@654 -- # valid_exec_arg rpc_cmd ublk_stop_disk 0 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@640 -- # local arg=rpc_cmd 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@644 -- # type -t rpc_cmd 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@644 -- # case "$(type -t "$arg")" in 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@655 -- # rpc_cmd ublk_stop_disk 0 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 [2024-12-15 05:06:20.228549] ublk.c:1087:ublk_stop_disk: *ERROR*: no ublk dev with ublk_id=0 00:16:02.245 request: 00:16:02.245 { 00:16:02.245 "ublk_id": 0, 00:16:02.245 "method": "ublk_stop_disk", 00:16:02.245 "req_id": 1 00:16:02.245 } 00:16:02.245 Got JSON-RPC error response 00:16:02.245 response: 00:16:02.245 { 00:16:02.245 "code": -19, 00:16:02.245 "message": "No such device" 00:16:02.245 } 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@591 -- # [[ 1 == 0 ]] 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@655 -- # es=1 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@663 -- # (( es > 128 )) 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@674 -- # [[ -n '' ]] 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@679 -- # (( !es == 0 )) 00:16:02.245 05:06:20 ublk.test_create_ublk -- ublk/ublk.sh@54 -- # rpc_cmd ublk_destroy_target 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 [2024-12-15 05:06:20.244524] ublk.c: 835:_ublk_fini: *DEBUG*: finish shutdown 00:16:02.245 [2024-12-15 05:06:20.246648] ublk.c: 766:_ublk_fini_done: *DEBUG*: 00:16:02.245 [2024-12-15 05:06:20.246677] ublk_rpc.c: 63:ublk_destroy_target_done: *NOTICE*: ublk target has been destroyed 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.245 05:06:20 ublk.test_create_ublk -- ublk/ublk.sh@56 -- # rpc_cmd bdev_malloc_delete Malloc0 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.245 05:06:20 ublk.test_create_ublk -- ublk/ublk.sh@57 -- # check_leftover_devices 00:16:02.245 05:06:20 ublk.test_create_ublk -- lvol/common.sh@25 -- # rpc_cmd bdev_get_bdevs 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.245 05:06:20 ublk.test_create_ublk -- lvol/common.sh@25 -- # leftover_bdevs='[]' 00:16:02.245 05:06:20 ublk.test_create_ublk -- lvol/common.sh@26 -- # jq length 00:16:02.245 05:06:20 
ublk.test_create_ublk -- lvol/common.sh@26 -- # '[' 0 == 0 ']' 00:16:02.245 05:06:20 ublk.test_create_ublk -- lvol/common.sh@27 -- # rpc_cmd bdev_lvol_get_lvstores 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.245 05:06:20 ublk.test_create_ublk -- lvol/common.sh@27 -- # leftover_lvs='[]' 00:16:02.245 05:06:20 ublk.test_create_ublk -- lvol/common.sh@28 -- # jq length 00:16:02.245 ************************************ 00:16:02.245 END TEST test_create_ublk 00:16:02.245 ************************************ 00:16:02.245 05:06:20 ublk.test_create_ublk -- lvol/common.sh@28 -- # '[' 0 == 0 ']' 00:16:02.245 00:16:02.245 real 0m10.788s 00:16:02.245 user 0m0.456s 00:16:02.245 sys 0m1.261s 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@1130 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 05:06:20 ublk -- ublk/ublk.sh@144 -- # run_test test_create_multi_ublk test_create_multi_ublk 00:16:02.245 05:06:20 ublk -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:16:02.245 05:06:20 ublk -- common/autotest_common.sh@1111 -- # xtrace_disable 00:16:02.245 05:06:20 ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 ************************************ 00:16:02.245 START TEST test_create_multi_ublk 00:16:02.245 ************************************ 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@1129 -- # test_create_multi_ublk 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@62 -- # rpc_cmd ublk_create_target 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 [2024-12-15 05:06:20.472455] ublk.c: 572:ublk_ctrl_cmd_get_features: *NOTICE*: User Copy enabled 00:16:02.245 [2024-12-15 05:06:20.473630] ublk.c: 758:ublk_create_target: *NOTICE*: UBLK target created successfully 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@62 -- # ublk_target= 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@64 -- # seq 0 3 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@64 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@66 -- # rpc_cmd bdev_malloc_create -b Malloc0 128 4096 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@66 -- # malloc_name=Malloc0 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@68 -- # rpc_cmd ublk_start_disk Malloc0 0 -q 4 -d 512 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 [2024-12-15 05:06:20.568582] ublk.c:1924:ublk_start_disk: *DEBUG*: ublk0: bdev Malloc0 
num_queues 4 queue_depth 512 00:16:02.245 [2024-12-15 05:06:20.568901] ublk.c:1965:ublk_start_disk: *INFO*: Enabling kernel access to bdev Malloc0 via ublk 0 00:16:02.245 [2024-12-15 05:06:20.568915] ublk.c: 971:ublk_dev_list_register: *DEBUG*: ublk0: add to tailq 00:16:02.245 [2024-12-15 05:06:20.568922] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_ADD_DEV 00:16:02.245 [2024-12-15 05:06:20.580475] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_ADD_DEV completed 00:16:02.245 [2024-12-15 05:06:20.580494] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_SET_PARAMS 00:16:02.245 [2024-12-15 05:06:20.592468] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_SET_PARAMS completed 00:16:02.245 [2024-12-15 05:06:20.592986] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_START_DEV 00:16:02.245 [2024-12-15 05:06:20.632464] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_START_DEV completed 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@68 -- # ublk_id=0 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@64 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@66 -- # rpc_cmd bdev_malloc_create -b Malloc1 128 4096 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@66 -- # malloc_name=Malloc1 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@68 -- # rpc_cmd ublk_start_disk Malloc1 1 -q 4 -d 512 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.245 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.245 [2024-12-15 05:06:20.740565] ublk.c:1924:ublk_start_disk: *DEBUG*: ublk1: bdev Malloc1 num_queues 4 queue_depth 512 00:16:02.245 [2024-12-15 05:06:20.740883] ublk.c:1965:ublk_start_disk: *INFO*: Enabling kernel access to bdev Malloc1 via ublk 1 00:16:02.245 [2024-12-15 05:06:20.740896] ublk.c: 971:ublk_dev_list_register: *DEBUG*: ublk1: add to tailq 00:16:02.245 [2024-12-15 05:06:20.740903] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_ADD_DEV 00:16:02.245 [2024-12-15 05:06:20.752481] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_ADD_DEV completed 00:16:02.245 [2024-12-15 05:06:20.752501] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_SET_PARAMS 00:16:02.246 [2024-12-15 05:06:20.764457] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_SET_PARAMS completed 00:16:02.246 [2024-12-15 05:06:20.764994] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_START_DEV 00:16:02.246 [2024-12-15 05:06:20.800470] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_START_DEV completed 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@68 -- # ublk_id=1 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@64 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.246 
05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@66 -- # rpc_cmd bdev_malloc_create -b Malloc2 128 4096 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@66 -- # malloc_name=Malloc2 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@68 -- # rpc_cmd ublk_start_disk Malloc2 2 -q 4 -d 512 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.246 [2024-12-15 05:06:20.903565] ublk.c:1924:ublk_start_disk: *DEBUG*: ublk2: bdev Malloc2 num_queues 4 queue_depth 512 00:16:02.246 [2024-12-15 05:06:20.903884] ublk.c:1965:ublk_start_disk: *INFO*: Enabling kernel access to bdev Malloc2 via ublk 2 00:16:02.246 [2024-12-15 05:06:20.903898] ublk.c: 971:ublk_dev_list_register: *DEBUG*: ublk2: add to tailq 00:16:02.246 [2024-12-15 05:06:20.903904] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk2: ctrl cmd UBLK_CMD_ADD_DEV 00:16:02.246 [2024-12-15 05:06:20.915480] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk2: ctrl cmd UBLK_CMD_ADD_DEV completed 00:16:02.246 [2024-12-15 05:06:20.915498] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk2: ctrl cmd UBLK_CMD_SET_PARAMS 00:16:02.246 [2024-12-15 05:06:20.927458] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk2: ctrl cmd UBLK_CMD_SET_PARAMS completed 00:16:02.246 [2024-12-15 05:06:20.927993] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk2: ctrl cmd UBLK_CMD_START_DEV 00:16:02.246 [2024-12-15 05:06:20.940494] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk2: ctrl cmd UBLK_CMD_START_DEV completed 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@68 -- # ublk_id=2 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@64 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- ublk/ublk.sh@66 -- # rpc_cmd bdev_malloc_create -b Malloc3 128 4096 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.246 05:06:20 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@66 -- # malloc_name=Malloc3 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@68 -- # rpc_cmd ublk_start_disk Malloc3 3 -q 4 -d 512 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.246 [2024-12-15 05:06:21.047556] ublk.c:1924:ublk_start_disk: *DEBUG*: ublk3: bdev Malloc3 num_queues 4 queue_depth 512 00:16:02.246 [2024-12-15 05:06:21.047871] ublk.c:1965:ublk_start_disk: *INFO*: Enabling kernel access to bdev Malloc3 via ublk 3 00:16:02.246 [2024-12-15 05:06:21.047885] ublk.c: 971:ublk_dev_list_register: *DEBUG*: ublk3: add to tailq 00:16:02.246 [2024-12-15 05:06:21.047891] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk3: ctrl cmd UBLK_CMD_ADD_DEV 00:16:02.246 
[2024-12-15 05:06:21.059469] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk3: ctrl cmd UBLK_CMD_ADD_DEV completed 00:16:02.246 [2024-12-15 05:06:21.059493] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk3: ctrl cmd UBLK_CMD_SET_PARAMS 00:16:02.246 [2024-12-15 05:06:21.071463] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk3: ctrl cmd UBLK_CMD_SET_PARAMS completed 00:16:02.246 [2024-12-15 05:06:21.072014] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk3: ctrl cmd UBLK_CMD_START_DEV 00:16:02.246 [2024-12-15 05:06:21.084480] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk3: ctrl cmd UBLK_CMD_START_DEV completed 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@68 -- # ublk_id=3 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@71 -- # rpc_cmd ublk_get_disks 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@71 -- # ublk_dev='[ 00:16:02.246 { 00:16:02.246 "ublk_device": "/dev/ublkb0", 00:16:02.246 "id": 0, 00:16:02.246 "queue_depth": 512, 00:16:02.246 "num_queues": 4, 00:16:02.246 "bdev_name": "Malloc0" 00:16:02.246 }, 00:16:02.246 { 00:16:02.246 "ublk_device": "/dev/ublkb1", 00:16:02.246 "id": 1, 00:16:02.246 "queue_depth": 512, 00:16:02.246 "num_queues": 4, 00:16:02.246 "bdev_name": "Malloc1" 00:16:02.246 }, 00:16:02.246 { 00:16:02.246 "ublk_device": "/dev/ublkb2", 00:16:02.246 "id": 2, 00:16:02.246 "queue_depth": 512, 00:16:02.246 "num_queues": 4, 00:16:02.246 "bdev_name": "Malloc2" 00:16:02.246 }, 00:16:02.246 { 00:16:02.246 "ublk_device": "/dev/ublkb3", 00:16:02.246 "id": 3, 00:16:02.246 "queue_depth": 512, 00:16:02.246 "num_queues": 4, 00:16:02.246 "bdev_name": "Malloc3" 00:16:02.246 } 00:16:02.246 ]' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@72 -- # seq 0 3 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@72 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@74 -- # jq -r '.[0].ublk_device' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@74 -- # [[ /dev/ublkb0 = \/\d\e\v\/\u\b\l\k\b\0 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@75 -- # jq -r '.[0].id' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@75 -- # [[ 0 = \0 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@76 -- # jq -r '.[0].queue_depth' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@76 -- # [[ 512 = \5\1\2 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@77 -- # jq -r '.[0].num_queues' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@77 -- # [[ 4 = \4 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@78 -- # jq -r '.[0].bdev_name' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@78 -- # [[ Malloc0 = \M\a\l\l\o\c\0 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@72 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@74 -- # jq -r '.[1].ublk_device' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@74 -- # [[ /dev/ublkb1 = 
\/\d\e\v\/\u\b\l\k\b\1 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@75 -- # jq -r '.[1].id' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@75 -- # [[ 1 = \1 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@76 -- # jq -r '.[1].queue_depth' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@76 -- # [[ 512 = \5\1\2 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@77 -- # jq -r '.[1].num_queues' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@77 -- # [[ 4 = \4 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@78 -- # jq -r '.[1].bdev_name' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@78 -- # [[ Malloc1 = \M\a\l\l\o\c\1 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@72 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@74 -- # jq -r '.[2].ublk_device' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@74 -- # [[ /dev/ublkb2 = \/\d\e\v\/\u\b\l\k\b\2 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@75 -- # jq -r '.[2].id' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@75 -- # [[ 2 = \2 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@76 -- # jq -r '.[2].queue_depth' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@76 -- # [[ 512 = \5\1\2 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@77 -- # jq -r '.[2].num_queues' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@77 -- # [[ 4 = \4 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@78 -- # jq -r '.[2].bdev_name' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@78 -- # [[ Malloc2 = \M\a\l\l\o\c\2 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@72 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@74 -- # jq -r '.[3].ublk_device' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@74 -- # [[ /dev/ublkb3 = \/\d\e\v\/\u\b\l\k\b\3 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@75 -- # jq -r '.[3].id' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@75 -- # [[ 3 = \3 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@76 -- # jq -r '.[3].queue_depth' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@76 -- # [[ 512 = \5\1\2 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@77 -- # jq -r '.[3].num_queues' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@77 -- # [[ 4 = \4 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@78 -- # jq -r '.[3].bdev_name' 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@78 -- # [[ Malloc3 = \M\a\l\l\o\c\3 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@84 -- # [[ 1 = \1 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@85 -- # seq 0 3 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@85 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@86 -- # rpc_cmd ublk_stop_disk 0 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.246 [2024-12-15 05:06:21.773543] ublk.c: 
469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_STOP_DEV 00:16:02.246 [2024-12-15 05:06:21.812018] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_STOP_DEV completed 00:16:02.246 [2024-12-15 05:06:21.813098] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_DEL_DEV 00:16:02.246 [2024-12-15 05:06:21.821458] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk0: ctrl cmd UBLK_CMD_DEL_DEV completed 00:16:02.246 [2024-12-15 05:06:21.821707] ublk.c: 985:ublk_dev_list_unregister: *DEBUG*: ublk0: remove from tailq 00:16:02.246 [2024-12-15 05:06:21.821719] ublk.c:1819:ublk_free_dev: *NOTICE*: ublk dev 0 stopped 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@85 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@86 -- # rpc_cmd ublk_stop_disk 1 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.246 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.246 [2024-12-15 05:06:21.837527] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_STOP_DEV 00:16:02.246 [2024-12-15 05:06:21.871024] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_STOP_DEV completed 00:16:02.247 [2024-12-15 05:06:21.872119] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_DEL_DEV 00:16:02.247 [2024-12-15 05:06:21.877460] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_DEL_DEV completed 00:16:02.247 [2024-12-15 05:06:21.877707] ublk.c: 985:ublk_dev_list_unregister: *DEBUG*: ublk1: remove from tailq 00:16:02.247 [2024-12-15 05:06:21.877718] ublk.c:1819:ublk_free_dev: *NOTICE*: ublk dev 1 stopped 00:16:02.247 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.247 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@85 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.247 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@86 -- # rpc_cmd ublk_stop_disk 2 00:16:02.247 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.247 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.247 [2024-12-15 05:06:21.892544] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk2: ctrl cmd UBLK_CMD_STOP_DEV 00:16:02.247 [2024-12-15 05:06:21.932499] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk2: ctrl cmd UBLK_CMD_STOP_DEV completed 00:16:02.247 [2024-12-15 05:06:21.933207] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk2: ctrl cmd UBLK_CMD_DEL_DEV 00:16:02.247 [2024-12-15 05:06:21.936723] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk2: ctrl cmd UBLK_CMD_DEL_DEV completed 00:16:02.247 [2024-12-15 05:06:21.936968] ublk.c: 985:ublk_dev_list_unregister: *DEBUG*: ublk2: remove from tailq 00:16:02.247 [2024-12-15 05:06:21.936979] ublk.c:1819:ublk_free_dev: *NOTICE*: ublk dev 2 stopped 00:16:02.247 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.247 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@85 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.247 05:06:21 ublk.test_create_multi_ublk -- ublk/ublk.sh@86 -- # rpc_cmd ublk_stop_disk 3 00:16:02.247 05:06:21 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.247 05:06:21 ublk.test_create_multi_ublk -- 
common/autotest_common.sh@10 -- # set +x 00:16:02.247 [2024-12-15 05:06:21.955518] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk3: ctrl cmd UBLK_CMD_STOP_DEV 00:16:02.247 [2024-12-15 05:06:21.995488] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk3: ctrl cmd UBLK_CMD_STOP_DEV completed 00:16:02.247 [2024-12-15 05:06:21.996199] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk3: ctrl cmd UBLK_CMD_DEL_DEV 00:16:02.247 [2024-12-15 05:06:22.003478] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk3: ctrl cmd UBLK_CMD_DEL_DEV completed 00:16:02.247 [2024-12-15 05:06:22.003714] ublk.c: 985:ublk_dev_list_unregister: *DEBUG*: ublk3: remove from tailq 00:16:02.247 [2024-12-15 05:06:22.003724] ublk.c:1819:ublk_free_dev: *NOTICE*: ublk dev 3 stopped 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@91 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -t 120 ublk_destroy_target 00:16:02.247 [2024-12-15 05:06:22.203527] ublk.c: 835:_ublk_fini: *DEBUG*: finish shutdown 00:16:02.247 [2024-12-15 05:06:22.205626] ublk.c: 766:_ublk_fini_done: *DEBUG*: 00:16:02.247 [2024-12-15 05:06:22.205659] ublk_rpc.c: 63:ublk_destroy_target_done: *NOTICE*: ublk target has been destroyed 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@93 -- # seq 0 3 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@93 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@94 -- # rpc_cmd bdev_malloc_delete Malloc0 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@93 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@94 -- # rpc_cmd bdev_malloc_delete Malloc1 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@93 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@94 -- # rpc_cmd bdev_malloc_delete Malloc2 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.247 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@93 -- # for i in $(seq 0 $MAX_DEV_ID) 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@94 -- # rpc_cmd bdev_malloc_delete Malloc3 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- ublk/ublk.sh@96 -- # check_leftover_devices 00:16:02.505 05:06:22 
ublk.test_create_multi_ublk -- lvol/common.sh@25 -- # rpc_cmd bdev_get_bdevs 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- lvol/common.sh@25 -- # leftover_bdevs='[]' 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- lvol/common.sh@26 -- # jq length 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- lvol/common.sh@26 -- # '[' 0 == 0 ']' 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- lvol/common.sh@27 -- # rpc_cmd bdev_lvol_get_lvstores 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- lvol/common.sh@27 -- # leftover_lvs='[]' 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- lvol/common.sh@28 -- # jq length 00:16:02.505 ************************************ 00:16:02.505 END TEST test_create_multi_ublk 00:16:02.505 ************************************ 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- lvol/common.sh@28 -- # '[' 0 == 0 ']' 00:16:02.505 00:16:02.505 real 0m2.155s 00:16:02.505 user 0m0.822s 00:16:02.505 sys 0m0.143s 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@1130 -- # xtrace_disable 00:16:02.505 05:06:22 ublk.test_create_multi_ublk -- common/autotest_common.sh@10 -- # set +x 00:16:02.763 05:06:22 ublk -- ublk/ublk.sh@146 -- # trap - SIGINT SIGTERM EXIT 00:16:02.763 05:06:22 ublk -- ublk/ublk.sh@147 -- # cleanup 00:16:02.763 05:06:22 ublk -- ublk/ublk.sh@130 -- # killprocess 86780 00:16:02.763 05:06:22 ublk -- common/autotest_common.sh@954 -- # '[' -z 86780 ']' 00:16:02.763 05:06:22 ublk -- common/autotest_common.sh@958 -- # kill -0 86780 00:16:02.763 05:06:22 ublk -- common/autotest_common.sh@959 -- # uname 00:16:02.763 05:06:22 ublk -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:16:02.763 05:06:22 ublk -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 86780 00:16:02.763 killing process with pid 86780 00:16:02.763 05:06:22 ublk -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:16:02.763 05:06:22 ublk -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:16:02.763 05:06:22 ublk -- common/autotest_common.sh@972 -- # echo 'killing process with pid 86780' 00:16:02.763 05:06:22 ublk -- common/autotest_common.sh@973 -- # kill 86780 00:16:02.763 05:06:22 ublk -- common/autotest_common.sh@978 -- # wait 86780 00:16:02.763 [2024-12-15 05:06:22.900972] ublk.c: 835:_ublk_fini: *DEBUG*: finish shutdown 00:16:02.763 [2024-12-15 05:06:22.901042] ublk.c: 766:_ublk_fini_done: *DEBUG*: 00:16:03.022 ************************************ 00:16:03.022 END TEST ublk 00:16:03.022 ************************************ 00:16:03.022 00:16:03.022 real 0m18.155s 00:16:03.022 user 0m27.837s 00:16:03.022 sys 0m7.889s 00:16:03.022 05:06:23 ublk -- common/autotest_common.sh@1130 -- # xtrace_disable 00:16:03.022 05:06:23 ublk -- common/autotest_common.sh@10 -- # set +x 00:16:03.284 05:06:23 -- spdk/autotest.sh@248 -- # run_test ublk_recovery /home/vagrant/spdk_repo/spdk/test/ublk/ublk_recovery.sh 00:16:03.284 
05:06:23 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:16:03.284 05:06:23 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:16:03.284 05:06:23 -- common/autotest_common.sh@10 -- # set +x 00:16:03.284 ************************************ 00:16:03.284 START TEST ublk_recovery 00:16:03.284 ************************************ 00:16:03.284 05:06:23 ublk_recovery -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/ublk/ublk_recovery.sh 00:16:03.284 * Looking for test storage... 00:16:03.284 * Found test storage at /home/vagrant/spdk_repo/spdk/test/ublk 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@1711 -- # lcov --version 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@333 -- # local ver1 ver1_l 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@334 -- # local ver2 ver2_l 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@336 -- # IFS=.-: 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@336 -- # read -ra ver1 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@337 -- # IFS=.-: 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@337 -- # read -ra ver2 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@338 -- # local 'op=<' 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@340 -- # ver1_l=2 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@341 -- # ver2_l=1 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@344 -- # case "$op" in 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@345 -- # : 1 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@364 -- # (( v = 0 )) 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@365 -- # decimal 1 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@353 -- # local d=1 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@355 -- # echo 1 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@365 -- # ver1[v]=1 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@366 -- # decimal 2 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@353 -- # local d=2 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@355 -- # echo 2 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@366 -- # ver2[v]=2 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:16:03.285 05:06:23 ublk_recovery -- scripts/common.sh@368 -- # return 0 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:16:03.285 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:03.285 --rc genhtml_branch_coverage=1 00:16:03.285 --rc genhtml_function_coverage=1 00:16:03.285 --rc genhtml_legend=1 00:16:03.285 --rc geninfo_all_blocks=1 00:16:03.285 --rc geninfo_unexecuted_blocks=1 00:16:03.285 00:16:03.285 ' 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:16:03.285 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:03.285 --rc genhtml_branch_coverage=1 00:16:03.285 --rc genhtml_function_coverage=1 00:16:03.285 --rc genhtml_legend=1 00:16:03.285 --rc geninfo_all_blocks=1 00:16:03.285 --rc geninfo_unexecuted_blocks=1 00:16:03.285 00:16:03.285 ' 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:16:03.285 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:03.285 --rc genhtml_branch_coverage=1 00:16:03.285 --rc genhtml_function_coverage=1 00:16:03.285 --rc genhtml_legend=1 00:16:03.285 --rc geninfo_all_blocks=1 00:16:03.285 --rc geninfo_unexecuted_blocks=1 00:16:03.285 00:16:03.285 ' 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:16:03.285 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:16:03.285 --rc genhtml_branch_coverage=1 00:16:03.285 --rc genhtml_function_coverage=1 00:16:03.285 --rc genhtml_legend=1 00:16:03.285 --rc geninfo_all_blocks=1 00:16:03.285 --rc geninfo_unexecuted_blocks=1 00:16:03.285 00:16:03.285 ' 00:16:03.285 05:06:23 ublk_recovery -- ublk/ublk_recovery.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/lvol/common.sh 00:16:03.285 05:06:23 ublk_recovery -- lvol/common.sh@6 -- # MALLOC_SIZE_MB=128 00:16:03.285 05:06:23 ublk_recovery -- lvol/common.sh@7 -- # MALLOC_BS=512 00:16:03.285 05:06:23 ublk_recovery -- lvol/common.sh@8 -- # AIO_SIZE_MB=400 00:16:03.285 05:06:23 ublk_recovery -- lvol/common.sh@9 -- # AIO_BS=4096 00:16:03.285 05:06:23 ublk_recovery -- lvol/common.sh@10 -- # LVS_DEFAULT_CLUSTER_SIZE_MB=4 00:16:03.285 05:06:23 ublk_recovery -- lvol/common.sh@11 -- # LVS_DEFAULT_CLUSTER_SIZE=4194304 00:16:03.285 05:06:23 ublk_recovery -- lvol/common.sh@13 -- # LVS_DEFAULT_CAPACITY_MB=124 00:16:03.285 05:06:23 ublk_recovery -- lvol/common.sh@14 
-- # LVS_DEFAULT_CAPACITY=130023424 00:16:03.285 05:06:23 ublk_recovery -- ublk/ublk_recovery.sh@11 -- # modprobe ublk_drv 00:16:03.285 05:06:23 ublk_recovery -- ublk/ublk_recovery.sh@19 -- # spdk_pid=87149 00:16:03.285 05:06:23 ublk_recovery -- ublk/ublk_recovery.sh@20 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:16:03.285 05:06:23 ublk_recovery -- ublk/ublk_recovery.sh@21 -- # waitforlisten 87149 00:16:03.285 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@835 -- # '[' -z 87149 ']' 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:16:03.285 05:06:23 ublk_recovery -- ublk/ublk_recovery.sh@18 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x3 -L ublk 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@840 -- # local max_retries=100 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@844 -- # xtrace_disable 00:16:03.285 05:06:23 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:16:03.546 [2024-12-15 05:06:23.452125] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:16:03.546 [2024-12-15 05:06:23.452262] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid87149 ] 00:16:03.546 [2024-12-15 05:06:23.611383] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:16:03.546 [2024-12-15 05:06:23.638020] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:16:03.546 [2024-12-15 05:06:23.638081] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:16:04.156 05:06:24 ublk_recovery -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:16:04.156 05:06:24 ublk_recovery -- common/autotest_common.sh@868 -- # return 0 00:16:04.156 05:06:24 ublk_recovery -- ublk/ublk_recovery.sh@23 -- # rpc_cmd ublk_create_target 00:16:04.156 05:06:24 ublk_recovery -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:04.156 05:06:24 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:16:04.156 [2024-12-15 05:06:24.200456] ublk.c: 572:ublk_ctrl_cmd_get_features: *NOTICE*: User Copy enabled 00:16:04.156 [2024-12-15 05:06:24.201844] ublk.c: 758:ublk_create_target: *NOTICE*: UBLK target created successfully 00:16:04.156 05:06:24 ublk_recovery -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:04.156 05:06:24 ublk_recovery -- ublk/ublk_recovery.sh@24 -- # rpc_cmd bdev_malloc_create -b malloc0 64 4096 00:16:04.156 05:06:24 ublk_recovery -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:04.156 05:06:24 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:16:04.156 malloc0 00:16:04.156 05:06:24 ublk_recovery -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:04.156 05:06:24 ublk_recovery -- ublk/ublk_recovery.sh@25 -- # rpc_cmd ublk_start_disk malloc0 1 -q 2 -d 128 00:16:04.156 05:06:24 ublk_recovery -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:04.156 05:06:24 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:16:04.156 [2024-12-15 05:06:24.248922] ublk.c:1924:ublk_start_disk: *DEBUG*: ublk1: bdev malloc0 num_queues 
2 queue_depth 128 00:16:04.156 [2024-12-15 05:06:24.249042] ublk.c:1965:ublk_start_disk: *INFO*: Enabling kernel access to bdev malloc0 via ublk 1 00:16:04.156 [2024-12-15 05:06:24.249050] ublk.c: 971:ublk_dev_list_register: *DEBUG*: ublk1: add to tailq 00:16:04.156 [2024-12-15 05:06:24.249060] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_ADD_DEV 00:16:04.156 [2024-12-15 05:06:24.256477] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_ADD_DEV completed 00:16:04.156 [2024-12-15 05:06:24.256505] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_SET_PARAMS 00:16:04.156 [2024-12-15 05:06:24.264474] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_SET_PARAMS completed 00:16:04.156 [2024-12-15 05:06:24.264621] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_START_DEV 00:16:04.436 [2024-12-15 05:06:24.286468] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_START_DEV completed 00:16:04.436 1 00:16:04.436 05:06:24 ublk_recovery -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:04.436 05:06:24 ublk_recovery -- ublk/ublk_recovery.sh@27 -- # sleep 1 00:16:05.378 05:06:25 ublk_recovery -- ublk/ublk_recovery.sh@31 -- # fio_proc=87182 00:16:05.378 05:06:25 ublk_recovery -- ublk/ublk_recovery.sh@33 -- # sleep 5 00:16:05.378 05:06:25 ublk_recovery -- ublk/ublk_recovery.sh@30 -- # taskset -c 2-3 fio --name=fio_test --filename=/dev/ublkb1 --numjobs=1 --iodepth=128 --ioengine=libaio --rw=randrw --direct=1 --time_based --runtime=60 00:16:05.378 fio_test: (g=0): rw=randrw, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=libaio, iodepth=128 00:16:05.378 fio-3.35 00:16:05.378 Starting 1 process 00:16:10.644 05:06:30 ublk_recovery -- ublk/ublk_recovery.sh@36 -- # kill -9 87149 00:16:10.644 05:06:30 ublk_recovery -- ublk/ublk_recovery.sh@38 -- # sleep 5 00:16:15.933 /home/vagrant/spdk_repo/spdk/test/ublk/ublk_recovery.sh: line 38: 87149 Killed "$SPDK_BIN_DIR/spdk_tgt" -m 0x3 -L ublk 00:16:15.933 05:06:35 ublk_recovery -- ublk/ublk_recovery.sh@42 -- # spdk_pid=87286 00:16:15.933 05:06:35 ublk_recovery -- ublk/ublk_recovery.sh@43 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:16:15.933 05:06:35 ublk_recovery -- ublk/ublk_recovery.sh@44 -- # waitforlisten 87286 00:16:15.933 05:06:35 ublk_recovery -- ublk/ublk_recovery.sh@41 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x3 -L ublk 00:16:15.933 05:06:35 ublk_recovery -- common/autotest_common.sh@835 -- # '[' -z 87286 ']' 00:16:15.933 05:06:35 ublk_recovery -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:16:15.933 05:06:35 ublk_recovery -- common/autotest_common.sh@840 -- # local max_retries=100 00:16:15.933 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:16:15.933 05:06:35 ublk_recovery -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:16:15.933 05:06:35 ublk_recovery -- common/autotest_common.sh@844 -- # xtrace_disable 00:16:15.933 05:06:35 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:16:15.933 [2024-12-15 05:06:35.387721] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
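For reference, the crash half of this recovery test reduces to the RPC sequence visible in the trace above. This is a minimal sketch reconstructed from the log: the driver, RPC names, bdev size, queue parameters, and fio flags are all taken verbatim from the trace, but running it standalone outside ublk_recovery.sh is illustrative only, and the pid placeholder is hypothetical.

    # load the kernel ublk driver and start an SPDK target with ublk tracing
    sudo modprobe ublk_drv
    /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x3 -L ublk &

    # expose a 64 MB malloc bdev (4096-byte blocks) as /dev/ublkb1 with 2 queues, depth 128
    scripts/rpc.py ublk_create_target
    scripts/rpc.py bdev_malloc_create -b malloc0 64 4096
    scripts/rpc.py ublk_start_disk malloc0 1 -q 2 -d 128

    # drive mixed I/O against the ublk device, then kill the target mid-run
    fio --name=fio_test --filename=/dev/ublkb1 --numjobs=1 --iodepth=128 \
        --ioengine=libaio --rw=randrw --direct=1 --time_based --runtime=60 &
    kill -9 "$SPDK_TGT_PID"   # placeholder; the script kills its own $spdk_pid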
00:16:15.933 [2024-12-15 05:06:35.387856] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid87286 ] 00:16:15.933 [2024-12-15 05:06:35.542620] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 2 00:16:15.933 [2024-12-15 05:06:35.574964] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:16:15.933 [2024-12-15 05:06:35.575032] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@868 -- # return 0 00:16:16.194 05:06:36 ublk_recovery -- ublk/ublk_recovery.sh@47 -- # rpc_cmd ublk_create_target 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:16:16.194 [2024-12-15 05:06:36.244459] ublk.c: 572:ublk_ctrl_cmd_get_features: *NOTICE*: User Copy enabled 00:16:16.194 [2024-12-15 05:06:36.246099] ublk.c: 758:ublk_create_target: *NOTICE*: UBLK target created successfully 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:16.194 05:06:36 ublk_recovery -- ublk/ublk_recovery.sh@48 -- # rpc_cmd bdev_malloc_create -b malloc0 64 4096 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:16:16.194 malloc0 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:16.194 05:06:36 ublk_recovery -- ublk/ublk_recovery.sh@49 -- # rpc_cmd ublk_recover_disk malloc0 1 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@563 -- # xtrace_disable 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:16:16.194 [2024-12-15 05:06:36.300614] ublk.c:2106:ublk_start_disk_recovery: *NOTICE*: Recovering ublk 1 with bdev malloc0 00:16:16.194 [2024-12-15 05:06:36.300673] ublk.c: 971:ublk_dev_list_register: *DEBUG*: ublk1: add to tailq 00:16:16.194 [2024-12-15 05:06:36.300683] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_GET_DEV_INFO 00:16:16.194 [2024-12-15 05:06:36.308509] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_GET_DEV_INFO completed 00:16:16.194 [2024-12-15 05:06:36.308537] ublk.c: 391:ublk_ctrl_process_cqe: *DEBUG*: ublk1: Ublk 1 device state 1 00:16:16.194 1 00:16:16.194 05:06:36 ublk_recovery -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:16:16.194 05:06:36 ublk_recovery -- ublk/ublk_recovery.sh@52 -- # wait 87182 00:16:17.579 [2024-12-15 05:06:37.308570] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_GET_DEV_INFO 00:16:17.579 [2024-12-15 05:06:37.315451] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_GET_DEV_INFO completed 00:16:17.579 [2024-12-15 05:06:37.315470] ublk.c: 391:ublk_ctrl_process_cqe: *DEBUG*: ublk1: Ublk 1 device state 1 00:16:18.522 [2024-12-15 05:06:38.315484] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_GET_DEV_INFO 00:16:18.522 [2024-12-15 05:06:38.319457] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_GET_DEV_INFO completed 00:16:18.522 [2024-12-15 05:06:38.319470] ublk.c: 391:ublk_ctrl_process_cqe: *DEBUG*: ublk1: 
Ublk 1 device state 1 00:16:19.463 [2024-12-15 05:06:39.319487] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_GET_DEV_INFO 00:16:19.463 [2024-12-15 05:06:39.327448] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_GET_DEV_INFO completed 00:16:19.463 [2024-12-15 05:06:39.327464] ublk.c: 391:ublk_ctrl_process_cqe: *DEBUG*: ublk1: Ublk 1 device state 1 00:16:19.464 [2024-12-15 05:06:39.327470] ublk.c:2035:ublk_ctrl_start_recovery: *DEBUG*: Recovering ublk 1, num queues 2, queue depth 128, flags 0xda 00:16:19.464 [2024-12-15 05:06:39.327535] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_START_USER_RECOVERY 00:16:41.421 [2024-12-15 05:07:00.667455] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_START_USER_RECOVERY completed 00:16:41.421 [2024-12-15 05:07:00.675062] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_END_USER_RECOVERY 00:16:41.421 [2024-12-15 05:07:00.681637] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_END_USER_RECOVERY completed 00:16:41.421 [2024-12-15 05:07:00.681655] ublk.c: 413:ublk_ctrl_process_cqe: *NOTICE*: Ublk 1 recover done successfully 00:17:08.113 00:17:08.113 fio_test: (groupid=0, jobs=1): err= 0: pid=87185: Sun Dec 15 05:07:25 2024 00:17:08.113 read: IOPS=14.9k, BW=58.2MiB/s (61.0MB/s)(3493MiB/60001msec) 00:17:08.113 slat (nsec): min=1103, max=125434, avg=4888.31, stdev=1367.05 00:17:08.113 clat (usec): min=781, max=30391k, avg=4490.23, stdev=270774.87 00:17:08.113 lat (usec): min=786, max=30391k, avg=4495.12, stdev=270774.88 00:17:08.113 clat percentiles (usec): 00:17:08.113 | 1.00th=[ 1713], 5.00th=[ 1795], 10.00th=[ 1844], 20.00th=[ 1876], 00:17:08.113 | 30.00th=[ 1893], 40.00th=[ 1909], 50.00th=[ 1926], 60.00th=[ 1942], 00:17:08.113 | 70.00th=[ 1958], 80.00th=[ 2024], 90.00th=[ 2147], 95.00th=[ 3064], 00:17:08.113 | 99.00th=[ 5342], 99.50th=[ 5866], 99.90th=[ 8717], 99.95th=[12649], 00:17:08.113 | 99.99th=[13173] 00:17:08.113 bw ( KiB/s): min=39264, max=133720, per=100.00%, avg=119253.83, stdev=18668.48, samples=59 00:17:08.113 iops : min= 9816, max=33430, avg=29813.46, stdev=4667.12, samples=59 00:17:08.113 write: IOPS=14.9k, BW=58.1MiB/s (61.0MB/s)(3488MiB/60001msec); 0 zone resets 00:17:08.113 slat (nsec): min=1159, max=295404, avg=4962.47, stdev=1455.27 00:17:08.113 clat (usec): min=774, max=30391k, avg=4094.12, stdev=242790.19 00:17:08.113 lat (usec): min=779, max=30391k, avg=4099.08, stdev=242790.20 00:17:08.113 clat percentiles (usec): 00:17:08.114 | 1.00th=[ 1762], 5.00th=[ 1876], 10.00th=[ 1926], 20.00th=[ 1958], 00:17:08.114 | 30.00th=[ 1975], 40.00th=[ 1991], 50.00th=[ 2008], 60.00th=[ 2024], 00:17:08.114 | 70.00th=[ 2057], 80.00th=[ 2114], 90.00th=[ 2245], 95.00th=[ 2966], 00:17:08.114 | 99.00th=[ 5407], 99.50th=[ 5997], 99.90th=[ 8848], 99.95th=[12518], 00:17:08.114 | 99.99th=[13304] 00:17:08.114 bw ( KiB/s): min=40056, max=132856, per=100.00%, avg=119065.22, stdev=18371.93, samples=59 00:17:08.114 iops : min=10014, max=33214, avg=29766.31, stdev=4592.98, samples=59 00:17:08.114 lat (usec) : 1000=0.01% 00:17:08.114 lat (msec) : 2=60.57%, 4=36.46%, 10=2.89%, 20=0.07%, >=2000=0.01% 00:17:08.114 cpu : usr=3.31%, sys=14.88%, ctx=58914, majf=0, minf=13 00:17:08.114 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.1%, >=64=100.0% 00:17:08.114 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:17:08.114 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:17:08.114 issued 
rwts: total=894145,892862,0,0 short=0,0,0,0 dropped=0,0,0,0 00:17:08.114 latency : target=0, window=0, percentile=100.00%, depth=128 00:17:08.114 00:17:08.114 Run status group 0 (all jobs): 00:17:08.114 READ: bw=58.2MiB/s (61.0MB/s), 58.2MiB/s-58.2MiB/s (61.0MB/s-61.0MB/s), io=3493MiB (3662MB), run=60001-60001msec 00:17:08.114 WRITE: bw=58.1MiB/s (61.0MB/s), 58.1MiB/s-58.1MiB/s (61.0MB/s-61.0MB/s), io=3488MiB (3657MB), run=60001-60001msec 00:17:08.114 00:17:08.114 Disk stats (read/write): 00:17:08.114 ublkb1: ios=890802/889485, merge=0/0, ticks=3963584/3531837, in_queue=7495421, util=99.88% 00:17:08.114 05:07:25 ublk_recovery -- ublk/ublk_recovery.sh@55 -- # rpc_cmd ublk_stop_disk 1 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@563 -- # xtrace_disable 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:17:08.114 [2024-12-15 05:07:25.544633] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_STOP_DEV 00:17:08.114 [2024-12-15 05:07:25.588470] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_STOP_DEV completed 00:17:08.114 [2024-12-15 05:07:25.588628] ublk.c: 469:ublk_ctrl_cmd_submit: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_DEL_DEV 00:17:08.114 [2024-12-15 05:07:25.592683] ublk.c: 349:ublk_ctrl_process_cqe: *DEBUG*: ublk1: ctrl cmd UBLK_CMD_DEL_DEV completed 00:17:08.114 [2024-12-15 05:07:25.592781] ublk.c: 985:ublk_dev_list_unregister: *DEBUG*: ublk1: remove from tailq 00:17:08.114 [2024-12-15 05:07:25.592789] ublk.c:1819:ublk_free_dev: *NOTICE*: ublk dev 1 stopped 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:17:08.114 05:07:25 ublk_recovery -- ublk/ublk_recovery.sh@56 -- # rpc_cmd ublk_destroy_target 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@563 -- # xtrace_disable 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:17:08.114 [2024-12-15 05:07:25.611519] ublk.c: 835:_ublk_fini: *DEBUG*: finish shutdown 00:17:08.114 [2024-12-15 05:07:25.612880] ublk.c: 766:_ublk_fini_done: *DEBUG*: 00:17:08.114 [2024-12-15 05:07:25.612908] ublk_rpc.c: 63:ublk_destroy_target_done: *NOTICE*: ublk target has been destroyed 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@591 -- # [[ 0 == 0 ]] 00:17:08.114 05:07:25 ublk_recovery -- ublk/ublk_recovery.sh@58 -- # trap - SIGINT SIGTERM EXIT 00:17:08.114 05:07:25 ublk_recovery -- ublk/ublk_recovery.sh@59 -- # cleanup 00:17:08.114 05:07:25 ublk_recovery -- ublk/ublk_recovery.sh@14 -- # killprocess 87286 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@954 -- # '[' -z 87286 ']' 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@958 -- # kill -0 87286 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@959 -- # uname 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 87286 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:17:08.114 killing process with pid 87286 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@972 -- # echo 'killing process with pid 87286' 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@973 -- # kill 87286 00:17:08.114 05:07:25 ublk_recovery -- common/autotest_common.sh@978 -- # wait 87286 
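The recovery half, which runs between the second target start and the fio summary above, uses the same RPC transport. A minimal sketch of that sequence, assuming the kernel-side /dev/ublkb1 device survived the kill -9 (all command names and arguments are copied from the trace):

    # on the restarted target, re-create the ublk target and backing bdev,
    # then re-attach to the surviving kernel device instead of starting a new one
    scripts/rpc.py ublk_create_target
    scripts/rpc.py bdev_malloc_create -b malloc0 64 4096
    scripts/rpc.py ublk_recover_disk malloc0 1   # drives the GET_DEV_INFO retries and
                                                 # START/END_USER_RECOVERY commands seen above

    # once the in-flight fio run completes, tear down as usual
    scripts/rpc.py ublk_stop_disk 1
    scripts/rpc.py ublk_destroy_target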
00:17:08.114 [2024-12-15 05:07:25.801071] ublk.c: 835:_ublk_fini: *DEBUG*: finish shutdown 00:17:08.114 [2024-12-15 05:07:25.801131] ublk.c: 766:_ublk_fini_done: *DEBUG*: 00:17:08.114 00:17:08.114 real 1m2.863s 00:17:08.114 user 1m45.512s 00:17:08.114 sys 0m20.532s 00:17:08.114 ************************************ 00:17:08.114 END TEST ublk_recovery 00:17:08.114 ************************************ 00:17:08.114 05:07:26 ublk_recovery -- common/autotest_common.sh@1130 -- # xtrace_disable 00:17:08.114 05:07:26 ublk_recovery -- common/autotest_common.sh@10 -- # set +x 00:17:08.114 05:07:26 -- spdk/autotest.sh@251 -- # [[ 0 -eq 1 ]] 00:17:08.114 05:07:26 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@260 -- # timing_exit lib 00:17:08.114 05:07:26 -- common/autotest_common.sh@732 -- # xtrace_disable 00:17:08.114 05:07:26 -- common/autotest_common.sh@10 -- # set +x 00:17:08.114 05:07:26 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@267 -- # '[' 0 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@276 -- # '[' 0 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@311 -- # '[' 0 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@315 -- # '[' 0 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@319 -- # '[' 0 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@324 -- # '[' 0 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@333 -- # '[' 0 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@338 -- # '[' 0 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@342 -- # '[' 1 -eq 1 ']' 00:17:08.114 05:07:26 -- spdk/autotest.sh@343 -- # run_test ftl /home/vagrant/spdk_repo/spdk/test/ftl/ftl.sh 00:17:08.114 05:07:26 -- common/autotest_common.sh@1105 -- # '[' 2 -le 1 ']' 00:17:08.114 05:07:26 -- common/autotest_common.sh@1111 -- # xtrace_disable 00:17:08.114 05:07:26 -- common/autotest_common.sh@10 -- # set +x 00:17:08.114 ************************************ 00:17:08.114 START TEST ftl 00:17:08.114 ************************************ 00:17:08.114 05:07:26 ftl -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/ftl/ftl.sh 00:17:08.114 * Looking for test storage... 
00:17:08.114 * Found test storage at /home/vagrant/spdk_repo/spdk/test/ftl 00:17:08.114 05:07:26 ftl -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:17:08.114 05:07:26 ftl -- common/autotest_common.sh@1711 -- # lcov --version 00:17:08.114 05:07:26 ftl -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:17:08.114 05:07:26 ftl -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:17:08.114 05:07:26 ftl -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:17:08.114 05:07:26 ftl -- scripts/common.sh@333 -- # local ver1 ver1_l 00:17:08.114 05:07:26 ftl -- scripts/common.sh@334 -- # local ver2 ver2_l 00:17:08.114 05:07:26 ftl -- scripts/common.sh@336 -- # IFS=.-: 00:17:08.114 05:07:26 ftl -- scripts/common.sh@336 -- # read -ra ver1 00:17:08.114 05:07:26 ftl -- scripts/common.sh@337 -- # IFS=.-: 00:17:08.114 05:07:26 ftl -- scripts/common.sh@337 -- # read -ra ver2 00:17:08.114 05:07:26 ftl -- scripts/common.sh@338 -- # local 'op=<' 00:17:08.114 05:07:26 ftl -- scripts/common.sh@340 -- # ver1_l=2 00:17:08.114 05:07:26 ftl -- scripts/common.sh@341 -- # ver2_l=1 00:17:08.114 05:07:26 ftl -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:17:08.114 05:07:26 ftl -- scripts/common.sh@344 -- # case "$op" in 00:17:08.114 05:07:26 ftl -- scripts/common.sh@345 -- # : 1 00:17:08.114 05:07:26 ftl -- scripts/common.sh@364 -- # (( v = 0 )) 00:17:08.114 05:07:26 ftl -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:17:08.114 05:07:26 ftl -- scripts/common.sh@365 -- # decimal 1 00:17:08.114 05:07:26 ftl -- scripts/common.sh@353 -- # local d=1 00:17:08.114 05:07:26 ftl -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:17:08.114 05:07:26 ftl -- scripts/common.sh@355 -- # echo 1 00:17:08.114 05:07:26 ftl -- scripts/common.sh@365 -- # ver1[v]=1 00:17:08.114 05:07:26 ftl -- scripts/common.sh@366 -- # decimal 2 00:17:08.114 05:07:26 ftl -- scripts/common.sh@353 -- # local d=2 00:17:08.114 05:07:26 ftl -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:17:08.114 05:07:26 ftl -- scripts/common.sh@355 -- # echo 2 00:17:08.114 05:07:26 ftl -- scripts/common.sh@366 -- # ver2[v]=2 00:17:08.114 05:07:26 ftl -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:17:08.114 05:07:26 ftl -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:17:08.114 05:07:26 ftl -- scripts/common.sh@368 -- # return 0 00:17:08.114 05:07:26 ftl -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:17:08.114 05:07:26 ftl -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:17:08.114 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:08.114 --rc genhtml_branch_coverage=1 00:17:08.114 --rc genhtml_function_coverage=1 00:17:08.114 --rc genhtml_legend=1 00:17:08.114 --rc geninfo_all_blocks=1 00:17:08.114 --rc geninfo_unexecuted_blocks=1 00:17:08.114 00:17:08.114 ' 00:17:08.114 05:07:26 ftl -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:17:08.114 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:08.114 --rc genhtml_branch_coverage=1 00:17:08.114 --rc genhtml_function_coverage=1 00:17:08.114 --rc genhtml_legend=1 00:17:08.114 --rc geninfo_all_blocks=1 00:17:08.114 --rc geninfo_unexecuted_blocks=1 00:17:08.114 00:17:08.114 ' 00:17:08.114 05:07:26 ftl -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:17:08.114 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:08.114 --rc genhtml_branch_coverage=1 00:17:08.114 --rc genhtml_function_coverage=1 00:17:08.114 --rc 
genhtml_legend=1 00:17:08.114 --rc geninfo_all_blocks=1 00:17:08.114 --rc geninfo_unexecuted_blocks=1 00:17:08.114 00:17:08.114 ' 00:17:08.114 05:07:26 ftl -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:17:08.114 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:08.114 --rc genhtml_branch_coverage=1 00:17:08.114 --rc genhtml_function_coverage=1 00:17:08.114 --rc genhtml_legend=1 00:17:08.114 --rc geninfo_all_blocks=1 00:17:08.114 --rc geninfo_unexecuted_blocks=1 00:17:08.114 00:17:08.114 ' 00:17:08.114 05:07:26 ftl -- ftl/ftl.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/ftl/common.sh 00:17:08.114 05:07:26 ftl -- ftl/common.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/ftl/ftl.sh 00:17:08.114 05:07:26 ftl -- ftl/common.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl 00:17:08.114 05:07:26 ftl -- ftl/common.sh@8 -- # testdir=/home/vagrant/spdk_repo/spdk/test/ftl 00:17:08.114 05:07:26 ftl -- ftl/common.sh@9 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl/../.. 00:17:08.114 05:07:26 ftl -- ftl/common.sh@9 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:17:08.114 05:07:26 ftl -- ftl/common.sh@10 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:17:08.114 05:07:26 ftl -- ftl/common.sh@12 -- # export 'ftl_tgt_core_mask=[0]' 00:17:08.115 05:07:26 ftl -- ftl/common.sh@12 -- # ftl_tgt_core_mask='[0]' 00:17:08.115 05:07:26 ftl -- ftl/common.sh@14 -- # export spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:08.115 05:07:26 ftl -- ftl/common.sh@14 -- # spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:08.115 05:07:26 ftl -- ftl/common.sh@15 -- # export 'spdk_tgt_cpumask=[0]' 00:17:08.115 05:07:26 ftl -- ftl/common.sh@15 -- # spdk_tgt_cpumask='[0]' 00:17:08.115 05:07:26 ftl -- ftl/common.sh@16 -- # export spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:17:08.115 05:07:26 ftl -- ftl/common.sh@16 -- # spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:17:08.115 05:07:26 ftl -- ftl/common.sh@17 -- # export spdk_tgt_pid= 00:17:08.115 05:07:26 ftl -- ftl/common.sh@17 -- # spdk_tgt_pid= 00:17:08.115 05:07:26 ftl -- ftl/common.sh@19 -- # export spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:08.115 05:07:26 ftl -- ftl/common.sh@19 -- # spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:08.115 05:07:26 ftl -- ftl/common.sh@20 -- # export 'spdk_ini_cpumask=[1]' 00:17:08.115 05:07:26 ftl -- ftl/common.sh@20 -- # spdk_ini_cpumask='[1]' 00:17:08.115 05:07:26 ftl -- ftl/common.sh@21 -- # export spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:17:08.115 05:07:26 ftl -- ftl/common.sh@21 -- # spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:17:08.115 05:07:26 ftl -- ftl/common.sh@22 -- # export spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:17:08.115 05:07:26 ftl -- ftl/common.sh@22 -- # spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:17:08.115 05:07:26 ftl -- ftl/common.sh@23 -- # export spdk_ini_pid= 00:17:08.115 05:07:26 ftl -- ftl/common.sh@23 -- # spdk_ini_pid= 00:17:08.115 05:07:26 ftl -- ftl/common.sh@25 -- # export spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:17:08.115 05:07:26 ftl -- ftl/common.sh@25 -- # spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:17:08.115 05:07:26 ftl -- ftl/ftl.sh@11 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:17:08.115 05:07:26 ftl -- ftl/ftl.sh@31 -- # trap at_ftl_exit SIGINT SIGTERM EXIT 00:17:08.115 05:07:26 ftl -- ftl/ftl.sh@34 -- # 
PCI_ALLOWED= 00:17:08.115 05:07:26 ftl -- ftl/ftl.sh@34 -- # PCI_BLOCKED= 00:17:08.115 05:07:26 ftl -- ftl/ftl.sh@34 -- # DRIVER_OVERRIDE= 00:17:08.115 05:07:26 ftl -- ftl/ftl.sh@34 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh 00:17:08.115 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev 00:17:08.115 0000:00:11.0 (1b36 0010): Already using the uio_pci_generic driver 00:17:08.115 0000:00:12.0 (1b36 0010): Already using the uio_pci_generic driver 00:17:08.115 0000:00:10.0 (1b36 0010): Already using the uio_pci_generic driver 00:17:08.115 0000:00:13.0 (1b36 0010): Already using the uio_pci_generic driver 00:17:08.115 05:07:26 ftl -- ftl/ftl.sh@37 -- # spdk_tgt_pid=88089 00:17:08.115 05:07:26 ftl -- ftl/ftl.sh@36 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt --wait-for-rpc 00:17:08.115 05:07:26 ftl -- ftl/ftl.sh@38 -- # waitforlisten 88089 00:17:08.115 05:07:26 ftl -- common/autotest_common.sh@835 -- # '[' -z 88089 ']' 00:17:08.115 05:07:26 ftl -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:08.115 05:07:26 ftl -- common/autotest_common.sh@840 -- # local max_retries=100 00:17:08.115 05:07:26 ftl -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:17:08.115 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:17:08.115 05:07:26 ftl -- common/autotest_common.sh@844 -- # xtrace_disable 00:17:08.115 05:07:26 ftl -- common/autotest_common.sh@10 -- # set +x 00:17:08.115 [2024-12-15 05:07:26.834080] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:17:08.115 [2024-12-15 05:07:26.834200] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid88089 ] 00:17:08.115 [2024-12-15 05:07:26.985170] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:08.115 [2024-12-15 05:07:27.004751] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:17:08.115 05:07:27 ftl -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:17:08.115 05:07:27 ftl -- common/autotest_common.sh@868 -- # return 0 00:17:08.115 05:07:27 ftl -- ftl/ftl.sh@40 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_set_options -d 00:17:08.115 05:07:27 ftl -- ftl/ftl.sh@41 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py framework_start_init 00:17:08.409 05:07:28 ftl -- ftl/ftl.sh@43 -- # /home/vagrant/spdk_repo/spdk/scripts/gen_nvme.sh 00:17:08.409 05:07:28 ftl -- ftl/ftl.sh@43 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py load_subsystem_config -j /dev/fd/62 00:17:08.673 05:07:28 ftl -- ftl/ftl.sh@46 -- # cache_size=1310720 00:17:08.673 05:07:28 ftl -- ftl/ftl.sh@47 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs 00:17:08.673 05:07:28 ftl -- ftl/ftl.sh@47 -- # jq -r '.[] | select(.md_size==64 and .zoned == false and .num_blocks >= 1310720).driver_specific.nvme[].pci_address' 00:17:08.933 05:07:28 ftl -- ftl/ftl.sh@47 -- # cache_disks=0000:00:10.0 00:17:08.933 05:07:28 ftl -- ftl/ftl.sh@48 -- # for disk in $cache_disks 00:17:08.933 05:07:28 ftl -- ftl/ftl.sh@49 -- # nv_cache=0000:00:10.0 00:17:08.933 05:07:28 ftl -- ftl/ftl.sh@50 -- # break 00:17:08.933 05:07:28 ftl -- ftl/ftl.sh@53 -- # '[' -z 0000:00:10.0 ']' 00:17:08.933 05:07:28 ftl -- 
ftl/ftl.sh@59 -- # base_size=1310720 00:17:08.933 05:07:28 ftl -- ftl/ftl.sh@60 -- # jq -r '.[] | select(.driver_specific.nvme[0].pci_address!="0000:00:10.0" and .zoned == false and .num_blocks >= 1310720).driver_specific.nvme[].pci_address' 00:17:08.933 05:07:28 ftl -- ftl/ftl.sh@60 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs 00:17:09.193 05:07:29 ftl -- ftl/ftl.sh@60 -- # base_disks=0000:00:11.0 00:17:09.193 05:07:29 ftl -- ftl/ftl.sh@61 -- # for disk in $base_disks 00:17:09.193 05:07:29 ftl -- ftl/ftl.sh@62 -- # device=0000:00:11.0 00:17:09.193 05:07:29 ftl -- ftl/ftl.sh@63 -- # break 00:17:09.193 05:07:29 ftl -- ftl/ftl.sh@66 -- # killprocess 88089 00:17:09.193 05:07:29 ftl -- common/autotest_common.sh@954 -- # '[' -z 88089 ']' 00:17:09.193 05:07:29 ftl -- common/autotest_common.sh@958 -- # kill -0 88089 00:17:09.193 05:07:29 ftl -- common/autotest_common.sh@959 -- # uname 00:17:09.193 05:07:29 ftl -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:17:09.193 05:07:29 ftl -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 88089 00:17:09.193 05:07:29 ftl -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:17:09.194 killing process with pid 88089 00:17:09.194 05:07:29 ftl -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:17:09.194 05:07:29 ftl -- common/autotest_common.sh@972 -- # echo 'killing process with pid 88089' 00:17:09.194 05:07:29 ftl -- common/autotest_common.sh@973 -- # kill 88089 00:17:09.194 05:07:29 ftl -- common/autotest_common.sh@978 -- # wait 88089 00:17:09.455 05:07:29 ftl -- ftl/ftl.sh@68 -- # '[' -z 0000:00:11.0 ']' 00:17:09.455 05:07:29 ftl -- ftl/ftl.sh@73 -- # run_test ftl_fio_basic /home/vagrant/spdk_repo/spdk/test/ftl/fio.sh 0000:00:11.0 0000:00:10.0 basic 00:17:09.455 05:07:29 ftl -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:17:09.455 05:07:29 ftl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:17:09.455 05:07:29 ftl -- common/autotest_common.sh@10 -- # set +x 00:17:09.455 ************************************ 00:17:09.455 START TEST ftl_fio_basic 00:17:09.455 ************************************ 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/ftl/fio.sh 0000:00:11.0 0000:00:10.0 basic 00:17:09.455 * Looking for test storage... 
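A note on the probe stage traced above: ftl.sh selects devices by asking the target for every bdev and filtering the JSON with jq. The NV-cache filter keeps bdevs with 64-byte metadata sectors that are not zoned and have at least 1310720 blocks; the base filter applies the same size floor but excludes the cache's PCI address, which is how 0000:00:10.0 ends up as the cache and 0000:00:11.0 as the base disk. A minimal bash re-creation of the two filters (the jq expressions and the rpc.py path are taken from the trace; the head and --arg plumbing stands in for the script's for/break loops):

    rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    # NV-cache candidates: 64-byte metadata, not zoned, >= 1310720 blocks
    cache_disks=$("$rpc_py" bdev_get_bdevs | jq -r '.[]
        | select(.md_size == 64 and .zoned == false and .num_blocks >= 1310720)
          .driver_specific.nvme[].pci_address')
    nv_cache=$(head -n1 <<< "$cache_disks")
    # Base candidates: same size floor, any PCI address except the cache's
    base_disks=$("$rpc_py" bdev_get_bdevs | jq -r --arg nv "$nv_cache" '.[]
        | select(.driver_specific.nvme[0].pci_address != $nv
            and .zoned == false and .num_blocks >= 1310720)
          .driver_specific.nvme[].pci_address')

Once both addresses are known, the probe target (pid 88089) is killed and the basic suite is re-launched under run_test, which is the START TEST banner beginning here.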
00:17:09.455 * Found test storage at /home/vagrant/spdk_repo/spdk/test/ftl 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@1711 -- # lcov --version 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@333 -- # local ver1 ver1_l 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@334 -- # local ver2 ver2_l 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@336 -- # IFS=.-: 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@336 -- # read -ra ver1 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@337 -- # IFS=.-: 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@337 -- # read -ra ver2 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@338 -- # local 'op=<' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@340 -- # ver1_l=2 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@341 -- # ver2_l=1 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@344 -- # case "$op" in 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@345 -- # : 1 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@364 -- # (( v = 0 )) 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@365 -- # decimal 1 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@353 -- # local d=1 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@355 -- # echo 1 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@365 -- # ver1[v]=1 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@366 -- # decimal 2 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@353 -- # local d=2 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@355 -- # echo 2 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@366 -- # ver2[v]=2 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- scripts/common.sh@368 -- # return 0 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:17:09.455 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:09.455 --rc genhtml_branch_coverage=1 00:17:09.455 --rc genhtml_function_coverage=1 00:17:09.455 --rc genhtml_legend=1 00:17:09.455 --rc geninfo_all_blocks=1 00:17:09.455 --rc geninfo_unexecuted_blocks=1 00:17:09.455 00:17:09.455 ' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:17:09.455 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:09.455 --rc 
genhtml_branch_coverage=1 00:17:09.455 --rc genhtml_function_coverage=1 00:17:09.455 --rc genhtml_legend=1 00:17:09.455 --rc geninfo_all_blocks=1 00:17:09.455 --rc geninfo_unexecuted_blocks=1 00:17:09.455 00:17:09.455 ' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:17:09.455 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:09.455 --rc genhtml_branch_coverage=1 00:17:09.455 --rc genhtml_function_coverage=1 00:17:09.455 --rc genhtml_legend=1 00:17:09.455 --rc geninfo_all_blocks=1 00:17:09.455 --rc geninfo_unexecuted_blocks=1 00:17:09.455 00:17:09.455 ' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:17:09.455 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:17:09.455 --rc genhtml_branch_coverage=1 00:17:09.455 --rc genhtml_function_coverage=1 00:17:09.455 --rc genhtml_legend=1 00:17:09.455 --rc geninfo_all_blocks=1 00:17:09.455 --rc geninfo_unexecuted_blocks=1 00:17:09.455 00:17:09.455 ' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/ftl/common.sh 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/ftl/fio.sh 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@8 -- # testdir=/home/vagrant/spdk_repo/spdk/test/ftl 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@9 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl/../.. 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@9 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@10 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@12 -- # export 'ftl_tgt_core_mask=[0]' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@12 -- # ftl_tgt_core_mask='[0]' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@14 -- # export spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@14 -- # spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@15 -- # export 'spdk_tgt_cpumask=[0]' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@15 -- # spdk_tgt_cpumask='[0]' 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@16 -- # export spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@16 -- # spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@17 -- # export spdk_tgt_pid= 00:17:09.455 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@17 -- # spdk_tgt_pid= 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@19 -- # export spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@19 -- # spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@20 -- # export 'spdk_ini_cpumask=[1]' 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@20 -- # spdk_ini_cpumask='[1]' 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@21 -- # export spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@21 -- # spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:17:09.456 
05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@22 -- # export spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@22 -- # spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@23 -- # export spdk_ini_pid= 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@23 -- # spdk_ini_pid= 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@25 -- # export spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/common.sh@25 -- # spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@11 -- # declare -A suite 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@12 -- # suite['basic']='randw-verify randw-verify-j2 randw-verify-depth128' 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@13 -- # suite['extended']='drive-prep randw-verify-qd128-ext randw-verify-qd2048-ext randw randr randrw unmap' 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@14 -- # suite['nightly']='drive-prep randw-verify-qd256-nght randw-verify-qd256-nght randw-verify-qd256-nght' 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@16 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@23 -- # device=0000:00:11.0 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@24 -- # cache_device=0000:00:10.0 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@25 -- # tests='randw-verify randw-verify-j2 randw-verify-depth128' 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@26 -- # uuid= 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@27 -- # timeout=240 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@29 -- # [[ y != y ]] 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@34 -- # '[' -z 'randw-verify randw-verify-j2 randw-verify-depth128' ']' 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@39 -- # export FTL_BDEV_NAME=ftl0 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@39 -- # FTL_BDEV_NAME=ftl0 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@40 -- # export FTL_JSON_CONF=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@40 -- # FTL_JSON_CONF=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@42 -- # trap 'fio_kill; exit 1' SIGINT SIGTERM EXIT 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@45 -- # svcpid=88199 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@46 -- # waitforlisten 88199 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@835 -- # '[' -z 88199 ']' 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- ftl/fio.sh@44 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 7 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:17:09.456 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@840 -- # local max_retries=100 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
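fio.sh then starts its own target (spdk_tgt -m 7, so three reactors) and blocks in waitforlisten until pid 88199 answers on /var/tmp/spdk.sock. The helper's body is not shown in the trace, so the following is only a simplified stand-in for the contract it implements (poll the RPC socket, bail out if the process dies); the function name is invented:

    rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    waitforlisten_sketch() {
        # $1 = pid of the target, $2 = RPC socket (defaults to spdk.sock)
        local pid=$1 rpc_addr=${2:-/var/tmp/spdk.sock} i
        for ((i = 0; i < 100; i++)); do
            kill -0 "$pid" 2>/dev/null || return 1     # target exited early
            # A successful rpc_get_methods means the listener is up
            "$rpc_py" -t 1 -s "$rpc_addr" rpc_get_methods &>/dev/null && return 0
            sleep 0.5
        done
        return 1                                       # gave up waiting
    }

The real helper in autotest_common.sh also sets max_retries=100 and disables xtrace while it spins, both visible as local assignments in the trace.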
00:17:09.456 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@844 -- # xtrace_disable 00:17:09.456 05:07:29 ftl.ftl_fio_basic -- common/autotest_common.sh@10 -- # set +x 00:17:09.716 [2024-12-15 05:07:29.603613] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:17:09.716 [2024-12-15 05:07:29.603720] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid88199 ] 00:17:09.716 [2024-12-15 05:07:29.761161] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:17:09.716 [2024-12-15 05:07:29.779677] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:17:09.716 [2024-12-15 05:07:29.779844] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:17:09.716 [2024-12-15 05:07:29.779880] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@868 -- # return 0 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- ftl/fio.sh@48 -- # create_base_bdev nvme0 0000:00:11.0 103424 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@54 -- # local name=nvme0 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@55 -- # local base_bdf=0000:00:11.0 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@56 -- # local size=103424 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@59 -- # local base_bdev 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@60 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t PCIe -a 0000:00:11.0 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@60 -- # base_bdev=nvme0n1 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@62 -- # local base_size 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@63 -- # get_bdev_size nvme0n1 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1382 -- # local bdev_name=nvme0n1 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1383 -- # local bdev_info 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1384 -- # local bs 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1385 -- # local nb 00:17:10.660 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b nvme0n1 00:17:10.921 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:17:10.921 { 00:17:10.921 "name": "nvme0n1", 00:17:10.921 "aliases": [ 00:17:10.921 "53366d5f-6605-4682-9c0a-85750ad9c44d" 00:17:10.921 ], 00:17:10.921 "product_name": "NVMe disk", 00:17:10.921 "block_size": 4096, 00:17:10.921 "num_blocks": 1310720, 00:17:10.921 "uuid": "53366d5f-6605-4682-9c0a-85750ad9c44d", 00:17:10.921 "numa_id": -1, 00:17:10.921 "assigned_rate_limits": { 00:17:10.921 "rw_ios_per_sec": 0, 00:17:10.921 "rw_mbytes_per_sec": 0, 00:17:10.921 "r_mbytes_per_sec": 0, 00:17:10.921 "w_mbytes_per_sec": 0 00:17:10.921 }, 00:17:10.921 "claimed": false, 00:17:10.921 "zoned": false, 00:17:10.921 "supported_io_types": { 00:17:10.921 "read": true, 00:17:10.921 "write": true, 00:17:10.921 "unmap": true, 00:17:10.921 "flush": true, 00:17:10.921 "reset": true, 00:17:10.921 "nvme_admin": true, 00:17:10.921 "nvme_io": true, 00:17:10.921 "nvme_io_md": 
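The nvme0n1 JSON dump that follows feeds get_bdev_size, whose traced steps (one bdev_get_bdevs call, two jq projections, integer arithmetic) condense to:

    rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
    get_bdev_size() {
        # Size of a bdev in MiB: block_size * num_blocks, as in the trace
        local bdev_name=$1 bdev_info bs nb
        bdev_info=$("$rpc_py" bdev_get_bdevs -b "$bdev_name")
        bs=$(jq '.[] .block_size' <<< "$bdev_info")    # 4096 for nvme0n1
        nb=$(jq '.[] .num_blocks' <<< "$bdev_info")    # 1310720 for nvme0n1
        echo $((bs * nb / 1024 / 1024))                # 4096 * 1310720 / 2^20 = 5120
    }

The resulting 5120 MiB makes the [[ 103424 -le 5120 ]] guard in common.sh come out false, so the script proceeds to build an lvstore on nvme0n1 and create a 103424 MiB volume on it with -t; thin provisioning is what lets a 5120 MiB disk back a 103424 MiB logical volume.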
false, 00:17:10.921 "write_zeroes": true, 00:17:10.921 "zcopy": false, 00:17:10.921 "get_zone_info": false, 00:17:10.921 "zone_management": false, 00:17:10.921 "zone_append": false, 00:17:10.921 "compare": true, 00:17:10.921 "compare_and_write": false, 00:17:10.921 "abort": true, 00:17:10.921 "seek_hole": false, 00:17:10.921 "seek_data": false, 00:17:10.921 "copy": true, 00:17:10.921 "nvme_iov_md": false 00:17:10.921 }, 00:17:10.921 "driver_specific": { 00:17:10.921 "nvme": [ 00:17:10.921 { 00:17:10.921 "pci_address": "0000:00:11.0", 00:17:10.921 "trid": { 00:17:10.921 "trtype": "PCIe", 00:17:10.921 "traddr": "0000:00:11.0" 00:17:10.921 }, 00:17:10.921 "ctrlr_data": { 00:17:10.921 "cntlid": 0, 00:17:10.921 "vendor_id": "0x1b36", 00:17:10.921 "model_number": "QEMU NVMe Ctrl", 00:17:10.922 "serial_number": "12341", 00:17:10.922 "firmware_revision": "8.0.0", 00:17:10.922 "subnqn": "nqn.2019-08.org.qemu:12341", 00:17:10.922 "oacs": { 00:17:10.922 "security": 0, 00:17:10.922 "format": 1, 00:17:10.922 "firmware": 0, 00:17:10.922 "ns_manage": 1 00:17:10.922 }, 00:17:10.922 "multi_ctrlr": false, 00:17:10.922 "ana_reporting": false 00:17:10.922 }, 00:17:10.922 "vs": { 00:17:10.922 "nvme_version": "1.4" 00:17:10.922 }, 00:17:10.922 "ns_data": { 00:17:10.922 "id": 1, 00:17:10.922 "can_share": false 00:17:10.922 } 00:17:10.922 } 00:17:10.922 ], 00:17:10.922 "mp_policy": "active_passive" 00:17:10.922 } 00:17:10.922 } 00:17:10.922 ]' 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1387 -- # bs=4096 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1388 -- # nb=1310720 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1391 -- # bdev_size=5120 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- common/autotest_common.sh@1392 -- # echo 5120 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@63 -- # base_size=5120 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@64 -- # [[ 103424 -le 5120 ]] 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@67 -- # clear_lvols 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@28 -- # jq -r '.[] | .uuid' 00:17:10.922 05:07:30 ftl.ftl_fio_basic -- ftl/common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_get_lvstores 00:17:11.183 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@28 -- # stores= 00:17:11.183 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@68 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create_lvstore nvme0n1 lvs 00:17:11.183 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@68 -- # lvs=ee30c46d-8cc9-4635-81c9-3e11a9c843e5 00:17:11.183 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@69 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create nvme0n1p0 103424 -t -u ee30c46d-8cc9-4635-81c9-3e11a9c843e5 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- ftl/fio.sh@48 -- # split_bdev=27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- ftl/fio.sh@49 -- # create_nv_cache_bdev nvc0 0000:00:10.0 27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@35 -- # local name=nvc0 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@36 -- # local cache_bdf=0000:00:10.0 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@37 -- # local base_bdev=27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:11.444 05:07:31 
ftl.ftl_fio_basic -- ftl/common.sh@38 -- # local cache_size= 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@41 -- # get_bdev_size 27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1382 -- # local bdev_name=27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1383 -- # local bdev_info 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1384 -- # local bs 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1385 -- # local nb 00:17:11.444 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:11.704 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:17:11.704 { 00:17:11.704 "name": "27e3433d-098c-4889-8b4e-4b95b76fb9fb", 00:17:11.704 "aliases": [ 00:17:11.704 "lvs/nvme0n1p0" 00:17:11.704 ], 00:17:11.704 "product_name": "Logical Volume", 00:17:11.704 "block_size": 4096, 00:17:11.704 "num_blocks": 26476544, 00:17:11.704 "uuid": "27e3433d-098c-4889-8b4e-4b95b76fb9fb", 00:17:11.704 "assigned_rate_limits": { 00:17:11.704 "rw_ios_per_sec": 0, 00:17:11.704 "rw_mbytes_per_sec": 0, 00:17:11.704 "r_mbytes_per_sec": 0, 00:17:11.704 "w_mbytes_per_sec": 0 00:17:11.704 }, 00:17:11.704 "claimed": false, 00:17:11.704 "zoned": false, 00:17:11.704 "supported_io_types": { 00:17:11.704 "read": true, 00:17:11.704 "write": true, 00:17:11.704 "unmap": true, 00:17:11.704 "flush": false, 00:17:11.704 "reset": true, 00:17:11.704 "nvme_admin": false, 00:17:11.704 "nvme_io": false, 00:17:11.704 "nvme_io_md": false, 00:17:11.704 "write_zeroes": true, 00:17:11.704 "zcopy": false, 00:17:11.704 "get_zone_info": false, 00:17:11.704 "zone_management": false, 00:17:11.704 "zone_append": false, 00:17:11.704 "compare": false, 00:17:11.704 "compare_and_write": false, 00:17:11.704 "abort": false, 00:17:11.704 "seek_hole": true, 00:17:11.704 "seek_data": true, 00:17:11.704 "copy": false, 00:17:11.704 "nvme_iov_md": false 00:17:11.704 }, 00:17:11.704 "driver_specific": { 00:17:11.704 "lvol": { 00:17:11.704 "lvol_store_uuid": "ee30c46d-8cc9-4635-81c9-3e11a9c843e5", 00:17:11.704 "base_bdev": "nvme0n1", 00:17:11.704 "thin_provision": true, 00:17:11.704 "num_allocated_clusters": 0, 00:17:11.704 "snapshot": false, 00:17:11.704 "clone": false, 00:17:11.704 "esnap_clone": false 00:17:11.704 } 00:17:11.704 } 00:17:11.704 } 00:17:11.704 ]' 00:17:11.704 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:17:11.704 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1387 -- # bs=4096 00:17:11.704 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:17:11.704 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1388 -- # nb=26476544 00:17:11.704 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:17:11.704 05:07:31 ftl.ftl_fio_basic -- common/autotest_common.sh@1392 -- # echo 103424 00:17:11.704 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@41 -- # local base_size=5171 00:17:11.704 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@44 -- # local nvc_bdev 00:17:11.704 05:07:31 ftl.ftl_fio_basic -- ftl/common.sh@45 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvc0 -t PCIe -a 0000:00:10.0 00:17:11.965 05:07:32 ftl.ftl_fio_basic -- ftl/common.sh@45 -- # nvc_bdev=nvc0n1 00:17:11.965 05:07:32 ftl.ftl_fio_basic -- 
ftl/common.sh@47 -- # [[ -z '' ]] 00:17:11.965 05:07:32 ftl.ftl_fio_basic -- ftl/common.sh@48 -- # get_bdev_size 27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:11.965 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1382 -- # local bdev_name=27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:11.965 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1383 -- # local bdev_info 00:17:11.965 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1384 -- # local bs 00:17:11.965 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1385 -- # local nb 00:17:11.965 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:12.226 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:17:12.226 { 00:17:12.226 "name": "27e3433d-098c-4889-8b4e-4b95b76fb9fb", 00:17:12.226 "aliases": [ 00:17:12.226 "lvs/nvme0n1p0" 00:17:12.226 ], 00:17:12.226 "product_name": "Logical Volume", 00:17:12.226 "block_size": 4096, 00:17:12.226 "num_blocks": 26476544, 00:17:12.226 "uuid": "27e3433d-098c-4889-8b4e-4b95b76fb9fb", 00:17:12.226 "assigned_rate_limits": { 00:17:12.226 "rw_ios_per_sec": 0, 00:17:12.226 "rw_mbytes_per_sec": 0, 00:17:12.226 "r_mbytes_per_sec": 0, 00:17:12.226 "w_mbytes_per_sec": 0 00:17:12.226 }, 00:17:12.226 "claimed": false, 00:17:12.226 "zoned": false, 00:17:12.226 "supported_io_types": { 00:17:12.226 "read": true, 00:17:12.226 "write": true, 00:17:12.226 "unmap": true, 00:17:12.226 "flush": false, 00:17:12.226 "reset": true, 00:17:12.226 "nvme_admin": false, 00:17:12.226 "nvme_io": false, 00:17:12.226 "nvme_io_md": false, 00:17:12.226 "write_zeroes": true, 00:17:12.226 "zcopy": false, 00:17:12.226 "get_zone_info": false, 00:17:12.226 "zone_management": false, 00:17:12.226 "zone_append": false, 00:17:12.226 "compare": false, 00:17:12.226 "compare_and_write": false, 00:17:12.226 "abort": false, 00:17:12.226 "seek_hole": true, 00:17:12.226 "seek_data": true, 00:17:12.226 "copy": false, 00:17:12.226 "nvme_iov_md": false 00:17:12.226 }, 00:17:12.226 "driver_specific": { 00:17:12.226 "lvol": { 00:17:12.226 "lvol_store_uuid": "ee30c46d-8cc9-4635-81c9-3e11a9c843e5", 00:17:12.226 "base_bdev": "nvme0n1", 00:17:12.226 "thin_provision": true, 00:17:12.226 "num_allocated_clusters": 0, 00:17:12.226 "snapshot": false, 00:17:12.226 "clone": false, 00:17:12.226 "esnap_clone": false 00:17:12.226 } 00:17:12.226 } 00:17:12.226 } 00:17:12.226 ]' 00:17:12.226 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:17:12.226 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1387 -- # bs=4096 00:17:12.226 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:17:12.226 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1388 -- # nb=26476544 00:17:12.226 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:17:12.226 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1392 -- # echo 103424 00:17:12.226 05:07:32 ftl.ftl_fio_basic -- ftl/common.sh@48 -- # cache_size=5171 00:17:12.226 05:07:32 ftl.ftl_fio_basic -- ftl/common.sh@50 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_split_create nvc0n1 -s 5171 1 00:17:12.487 05:07:32 ftl.ftl_fio_basic -- ftl/fio.sh@49 -- # nv_cache=nvc0n1p0 00:17:12.487 05:07:32 ftl.ftl_fio_basic -- ftl/fio.sh@51 -- # l2p_percentage=60 00:17:12.487 05:07:32 ftl.ftl_fio_basic -- ftl/fio.sh@52 -- # '[' -eq 1 ']' 00:17:12.487 
/home/vagrant/spdk_repo/spdk/test/ftl/fio.sh: line 52: [: -eq: unary operator expected 00:17:12.487 05:07:32 ftl.ftl_fio_basic -- ftl/fio.sh@56 -- # get_bdev_size 27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:12.487 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1382 -- # local bdev_name=27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:12.487 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1383 -- # local bdev_info 00:17:12.487 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1384 -- # local bs 00:17:12.487 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1385 -- # local nb 00:17:12.487 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 27e3433d-098c-4889-8b4e-4b95b76fb9fb 00:17:12.748 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:17:12.748 { 00:17:12.748 "name": "27e3433d-098c-4889-8b4e-4b95b76fb9fb", 00:17:12.748 "aliases": [ 00:17:12.748 "lvs/nvme0n1p0" 00:17:12.748 ], 00:17:12.748 "product_name": "Logical Volume", 00:17:12.748 "block_size": 4096, 00:17:12.748 "num_blocks": 26476544, 00:17:12.748 "uuid": "27e3433d-098c-4889-8b4e-4b95b76fb9fb", 00:17:12.748 "assigned_rate_limits": { 00:17:12.748 "rw_ios_per_sec": 0, 00:17:12.748 "rw_mbytes_per_sec": 0, 00:17:12.748 "r_mbytes_per_sec": 0, 00:17:12.748 "w_mbytes_per_sec": 0 00:17:12.748 }, 00:17:12.748 "claimed": false, 00:17:12.748 "zoned": false, 00:17:12.748 "supported_io_types": { 00:17:12.748 "read": true, 00:17:12.748 "write": true, 00:17:12.748 "unmap": true, 00:17:12.748 "flush": false, 00:17:12.748 "reset": true, 00:17:12.748 "nvme_admin": false, 00:17:12.748 "nvme_io": false, 00:17:12.748 "nvme_io_md": false, 00:17:12.748 "write_zeroes": true, 00:17:12.748 "zcopy": false, 00:17:12.748 "get_zone_info": false, 00:17:12.748 "zone_management": false, 00:17:12.748 "zone_append": false, 00:17:12.748 "compare": false, 00:17:12.748 "compare_and_write": false, 00:17:12.748 "abort": false, 00:17:12.748 "seek_hole": true, 00:17:12.748 "seek_data": true, 00:17:12.748 "copy": false, 00:17:12.748 "nvme_iov_md": false 00:17:12.748 }, 00:17:12.748 "driver_specific": { 00:17:12.748 "lvol": { 00:17:12.748 "lvol_store_uuid": "ee30c46d-8cc9-4635-81c9-3e11a9c843e5", 00:17:12.748 "base_bdev": "nvme0n1", 00:17:12.748 "thin_provision": true, 00:17:12.748 "num_allocated_clusters": 0, 00:17:12.748 "snapshot": false, 00:17:12.748 "clone": false, 00:17:12.748 "esnap_clone": false 00:17:12.748 } 00:17:12.748 } 00:17:12.748 } 00:17:12.748 ]' 00:17:12.748 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:17:12.748 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1387 -- # bs=4096 00:17:12.748 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:17:12.748 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1388 -- # nb=26476544 00:17:12.748 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:17:12.748 05:07:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1392 -- # echo 103424 00:17:12.748 05:07:32 ftl.ftl_fio_basic -- ftl/fio.sh@56 -- # l2p_dram_size_mb=60 00:17:12.748 05:07:32 ftl.ftl_fio_basic -- ftl/fio.sh@58 -- # '[' -z '' ']' 00:17:12.748 05:07:32 ftl.ftl_fio_basic -- ftl/fio.sh@60 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -t 240 bdev_ftl_create -b ftl0 -d 27e3433d-098c-4889-8b4e-4b95b76fb9fb -c nvc0n1p0 --l2p_dram_limit 60 00:17:13.010 [2024-12-15 05:07:32.908567] mngt/ftl_mngt.c: 
427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.908605] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:17:13.010 [2024-12-15 05:07:32.908616] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:17:13.010 [2024-12-15 05:07:32.908623] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.010 [2024-12-15 05:07:32.908677] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.908686] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:17:13.010 [2024-12-15 05:07:32.908693] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.040 ms 00:17:13.010 [2024-12-15 05:07:32.908701] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.010 [2024-12-15 05:07:32.908744] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:17:13.010 [2024-12-15 05:07:32.908977] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:17:13.010 [2024-12-15 05:07:32.908996] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.909003] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:17:13.010 [2024-12-15 05:07:32.909010] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.275 ms 00:17:13.010 [2024-12-15 05:07:32.909017] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.010 [2024-12-15 05:07:32.909074] mngt/ftl_mngt_md.c: 570:ftl_mngt_superblock_init: *NOTICE*: [FTL][ftl0] Create new FTL, UUID c2c96026-3b10-460b-8866-2d7afa6cf328 00:17:13.010 [2024-12-15 05:07:32.910112] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.910210] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Default-initialize superblock 00:17:13.010 [2024-12-15 05:07:32.910234] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.020 ms 00:17:13.010 [2024-12-15 05:07:32.910241] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.010 [2024-12-15 05:07:32.915488] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.915512] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:17:13.010 [2024-12-15 05:07:32.915522] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.175 ms 00:17:13.010 [2024-12-15 05:07:32.915529] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.010 [2024-12-15 05:07:32.915611] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.915619] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:17:13.010 [2024-12-15 05:07:32.915627] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.057 ms 00:17:13.010 [2024-12-15 05:07:32.915642] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.010 [2024-12-15 05:07:32.915699] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.915707] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:17:13.010 [2024-12-15 05:07:32.915723] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.015 ms 00:17:13.010 [2024-12-15 05:07:32.915728] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] 
status: 0 00:17:13.010 [2024-12-15 05:07:32.915760] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:17:13.010 [2024-12-15 05:07:32.917083] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.917179] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:17:13.010 [2024-12-15 05:07:32.917190] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.330 ms 00:17:13.010 [2024-12-15 05:07:32.917207] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.010 [2024-12-15 05:07:32.917244] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.917252] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:17:13.010 [2024-12-15 05:07:32.917259] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.010 ms 00:17:13.010 [2024-12-15 05:07:32.917267] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.010 [2024-12-15 05:07:32.917290] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 1 00:17:13.010 [2024-12-15 05:07:32.917408] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:17:13.010 [2024-12-15 05:07:32.917417] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:17:13.010 [2024-12-15 05:07:32.917448] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:17:13.010 [2024-12-15 05:07:32.917456] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:17:13.010 [2024-12-15 05:07:32.917465] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:17:13.010 [2024-12-15 05:07:32.917472] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:17:13.010 [2024-12-15 05:07:32.917478] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:17:13.010 [2024-12-15 05:07:32.917484] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:17:13.010 [2024-12-15 05:07:32.917491] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:17:13.010 [2024-12-15 05:07:32.917497] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.917504] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:17:13.010 [2024-12-15 05:07:32.917509] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.207 ms 00:17:13.010 [2024-12-15 05:07:32.917516] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.010 [2024-12-15 05:07:32.917588] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.010 [2024-12-15 05:07:32.917605] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:17:13.010 [2024-12-15 05:07:32.917613] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.053 ms 00:17:13.010 [2024-12-15 05:07:32.917620] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.010 [2024-12-15 05:07:32.917714] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:17:13.010 [2024-12-15 05:07:32.917723] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:17:13.010 
[2024-12-15 05:07:32.917729] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:17:13.010 [2024-12-15 05:07:32.917738] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:17:13.010 [2024-12-15 05:07:32.917744] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:17:13.010 [2024-12-15 05:07:32.917751] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:17:13.010 [2024-12-15 05:07:32.917756] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:17:13.010 [2024-12-15 05:07:32.917762] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:17:13.010 [2024-12-15 05:07:32.917768] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:17:13.010 [2024-12-15 05:07:32.917775] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:17:13.010 [2024-12-15 05:07:32.917781] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:17:13.010 [2024-12-15 05:07:32.917788] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:17:13.011 [2024-12-15 05:07:32.917794] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:17:13.011 [2024-12-15 05:07:32.917803] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:17:13.011 [2024-12-15 05:07:32.917820] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:17:13.011 [2024-12-15 05:07:32.917827] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:17:13.011 [2024-12-15 05:07:32.917838] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:17:13.011 [2024-12-15 05:07:32.917845] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:17:13.011 [2024-12-15 05:07:32.917851] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:17:13.011 [2024-12-15 05:07:32.917858] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:17:13.011 [2024-12-15 05:07:32.917864] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:17:13.011 [2024-12-15 05:07:32.917871] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:17:13.011 [2024-12-15 05:07:32.917877] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:17:13.011 [2024-12-15 05:07:32.917885] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:17:13.011 [2024-12-15 05:07:32.917890] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:17:13.011 [2024-12-15 05:07:32.917898] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:17:13.011 [2024-12-15 05:07:32.917904] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:17:13.011 [2024-12-15 05:07:32.917911] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:17:13.011 [2024-12-15 05:07:32.917916] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:17:13.011 [2024-12-15 05:07:32.917927] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:17:13.011 [2024-12-15 05:07:32.917933] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:17:13.011 [2024-12-15 05:07:32.917941] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:17:13.011 [2024-12-15 05:07:32.917947] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:17:13.011 [2024-12-15 05:07:32.917953] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 
0.25 MiB 00:17:13.011 [2024-12-15 05:07:32.917959] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:17:13.011 [2024-12-15 05:07:32.917966] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:17:13.011 [2024-12-15 05:07:32.917972] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:17:13.011 [2024-12-15 05:07:32.917979] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:17:13.011 [2024-12-15 05:07:32.917985] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:17:13.011 [2024-12-15 05:07:32.917992] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:17:13.011 [2024-12-15 05:07:32.917997] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:17:13.011 [2024-12-15 05:07:32.918005] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:17:13.011 [2024-12-15 05:07:32.918011] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:17:13.011 [2024-12-15 05:07:32.918018] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:17:13.011 [2024-12-15 05:07:32.918025] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:17:13.011 [2024-12-15 05:07:32.918043] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:17:13.011 [2024-12-15 05:07:32.918051] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:17:13.011 [2024-12-15 05:07:32.918059] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:17:13.011 [2024-12-15 05:07:32.918067] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:17:13.011 [2024-12-15 05:07:32.918074] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:17:13.011 [2024-12-15 05:07:32.918079] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:17:13.011 [2024-12-15 05:07:32.918087] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:17:13.011 [2024-12-15 05:07:32.918093] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:17:13.011 [2024-12-15 05:07:32.918101] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:17:13.011 [2024-12-15 05:07:32.918109] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:17:13.011 [2024-12-15 05:07:32.918119] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:17:13.011 [2024-12-15 05:07:32.918125] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:17:13.011 [2024-12-15 05:07:32.918132] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:17:13.011 [2024-12-15 05:07:32.918139] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:17:13.011 [2024-12-15 05:07:32.918146] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:17:13.011 [2024-12-15 05:07:32.918152] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:17:13.011 [2024-12-15 
05:07:32.918160] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:17:13.011 [2024-12-15 05:07:32.918167] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:17:13.011 [2024-12-15 05:07:32.918173] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:17:13.011 [2024-12-15 05:07:32.918178] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:17:13.011 [2024-12-15 05:07:32.918184] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:17:13.011 [2024-12-15 05:07:32.918195] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:17:13.011 [2024-12-15 05:07:32.918202] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:17:13.011 [2024-12-15 05:07:32.918207] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:17:13.011 [2024-12-15 05:07:32.918214] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:17:13.011 [2024-12-15 05:07:32.918220] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:17:13.011 [2024-12-15 05:07:32.918227] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:17:13.011 [2024-12-15 05:07:32.918232] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:17:13.011 [2024-12-15 05:07:32.918238] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:17:13.011 [2024-12-15 05:07:32.918243] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:17:13.011 [2024-12-15 05:07:32.918251] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:13.011 [2024-12-15 05:07:32.918256] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:17:13.011 [2024-12-15 05:07:32.918266] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.586 ms 00:17:13.011 [2024-12-15 05:07:32.918272] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:13.011 [2024-12-15 05:07:32.918343] mngt/ftl_mngt_misc.c: 165:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] NV cache data region needs scrubbing, this may take a while. 
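Two details in the stretch above are worth pinning down. First, the layout dump is self-consistent: 20971520 L2P entries at the stated 4-byte address size come to exactly the 80.00 MiB of the l2p region, and those entries map 20971520 x 4 KiB = 80 GiB of logical space carved from the 103424.00 MiB base device (the remainder goes to bands, metadata regions, and spare capacity). Second, fio.sh line 52 genuinely misfired earlier in the log: its test variable is empty, so the command degenerates to '[' -eq 1 ']' and bash reports "unary operator expected"; the run proceeds only because that branch is advisory. A sketch of the failure and the usual repairs (opt is an invented stand-in; the real variable name is not visible in the trace):

    opt=                      # empty, as on fio.sh line 52
    [ $opt -eq 1 ]            # degenerates to '[' -eq 1 ']': the traced error
    [ "${opt:-0}" -eq 1 ]     # quoting plus a default keeps both operands present
    [[ $opt -eq 1 ]]          # [[ ]] evaluates the empty expansion as 0, no error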
00:17:13.011 [2024-12-15 05:07:32.918353] mngt/ftl_mngt_misc.c: 166:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] Scrubbing 5 chunks 00:17:15.541 [2024-12-15 05:07:35.061782] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.061841] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Scrub NV cache 00:17:15.541 [2024-12-15 05:07:35.061857] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2143.425 ms 00:17:15.541 [2024-12-15 05:07:35.061866] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.070477] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.070517] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:17:15.541 [2024-12-15 05:07:35.070533] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.519 ms 00:17:15.541 [2024-12-15 05:07:35.070542] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.070647] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.070656] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:17:15.541 [2024-12-15 05:07:35.070666] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.059 ms 00:17:15.541 [2024-12-15 05:07:35.070673] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.088000] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.088059] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:17:15.541 [2024-12-15 05:07:35.088079] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 17.265 ms 00:17:15.541 [2024-12-15 05:07:35.088092] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.088152] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.088166] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:17:15.541 [2024-12-15 05:07:35.088180] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:17:15.541 [2024-12-15 05:07:35.088191] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.088631] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.088666] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:17:15.541 [2024-12-15 05:07:35.088685] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.359 ms 00:17:15.541 [2024-12-15 05:07:35.088715] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.088901] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.088916] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:17:15.541 [2024-12-15 05:07:35.088931] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.140 ms 00:17:15.541 [2024-12-15 05:07:35.088944] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.095571] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.095614] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:17:15.541 [2024-12-15 
05:07:35.095631] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.587 ms 00:17:15.541 [2024-12-15 05:07:35.095645] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.104049] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 59 (of 60) MiB 00:17:15.541 [2024-12-15 05:07:35.118625] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.118672] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:17:15.541 [2024-12-15 05:07:35.118682] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 22.886 ms 00:17:15.541 [2024-12-15 05:07:35.118692] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.155146] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.155189] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear L2P 00:17:15.541 [2024-12-15 05:07:35.155200] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 36.419 ms 00:17:15.541 [2024-12-15 05:07:35.155211] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.155387] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.155406] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:17:15.541 [2024-12-15 05:07:35.155415] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.135 ms 00:17:15.541 [2024-12-15 05:07:35.155424] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.158262] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.158302] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial band info metadata 00:17:15.541 [2024-12-15 05:07:35.158313] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.789 ms 00:17:15.541 [2024-12-15 05:07:35.158324] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.160601] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.541 [2024-12-15 05:07:35.160637] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial chunk info metadata 00:17:15.541 [2024-12-15 05:07:35.160647] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.235 ms 00:17:15.541 [2024-12-15 05:07:35.160655] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.541 [2024-12-15 05:07:35.160950] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.542 [2024-12-15 05:07:35.160968] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:17:15.542 [2024-12-15 05:07:35.160977] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.260 ms 00:17:15.542 [2024-12-15 05:07:35.160987] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.542 [2024-12-15 05:07:35.181659] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.542 [2024-12-15 05:07:35.181700] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Wipe P2L region 00:17:15.542 [2024-12-15 05:07:35.181711] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 20.647 ms 00:17:15.542 [2024-12-15 05:07:35.181721] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.542 [2024-12-15 05:07:35.185323] 
mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.542 [2024-12-15 05:07:35.185362] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim map 00:17:15.542 [2024-12-15 05:07:35.185373] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.533 ms 00:17:15.542 [2024-12-15 05:07:35.185384] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.542 [2024-12-15 05:07:35.188162] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.542 [2024-12-15 05:07:35.188199] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim log 00:17:15.542 [2024-12-15 05:07:35.188208] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.660 ms 00:17:15.542 [2024-12-15 05:07:35.188217] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.542 [2024-12-15 05:07:35.191600] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.542 [2024-12-15 05:07:35.191645] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:17:15.542 [2024-12-15 05:07:35.191657] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.344 ms 00:17:15.542 [2024-12-15 05:07:35.191671] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.542 [2024-12-15 05:07:35.191751] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.542 [2024-12-15 05:07:35.191765] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:17:15.542 [2024-12-15 05:07:35.191774] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms 00:17:15.542 [2024-12-15 05:07:35.191784] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.542 [2024-12-15 05:07:35.191854] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:15.542 [2024-12-15 05:07:35.191872] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:17:15.542 [2024-12-15 05:07:35.191880] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.032 ms 00:17:15.542 [2024-12-15 05:07:35.191889] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:15.542 [2024-12-15 05:07:35.192841] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 2283.854 ms, result 0 00:17:15.542 { 00:17:15.542 "name": "ftl0", 00:17:15.542 "uuid": "c2c96026-3b10-460b-8866-2d7afa6cf328" 00:17:15.542 } 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- ftl/fio.sh@65 -- # waitforbdev ftl0 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- common/autotest_common.sh@903 -- # local bdev_name=ftl0 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- common/autotest_common.sh@904 -- # local bdev_timeout= 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- common/autotest_common.sh@905 -- # local i 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- common/autotest_common.sh@906 -- # [[ -z '' ]] 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- common/autotest_common.sh@906 -- # bdev_timeout=2000 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- common/autotest_common.sh@908 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_wait_for_examine 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- common/autotest_common.sh@910 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b ftl0 -t 2000 00:17:15.542 [ 00:17:15.542 { 00:17:15.542 "name": "ftl0", 00:17:15.542 "aliases": [ 00:17:15.542 "c2c96026-3b10-460b-8866-2d7afa6cf328" 00:17:15.542 ], 00:17:15.542 "product_name": "FTL disk", 00:17:15.542 
"block_size": 4096, 00:17:15.542 "num_blocks": 20971520, 00:17:15.542 "uuid": "c2c96026-3b10-460b-8866-2d7afa6cf328", 00:17:15.542 "assigned_rate_limits": { 00:17:15.542 "rw_ios_per_sec": 0, 00:17:15.542 "rw_mbytes_per_sec": 0, 00:17:15.542 "r_mbytes_per_sec": 0, 00:17:15.542 "w_mbytes_per_sec": 0 00:17:15.542 }, 00:17:15.542 "claimed": false, 00:17:15.542 "zoned": false, 00:17:15.542 "supported_io_types": { 00:17:15.542 "read": true, 00:17:15.542 "write": true, 00:17:15.542 "unmap": true, 00:17:15.542 "flush": true, 00:17:15.542 "reset": false, 00:17:15.542 "nvme_admin": false, 00:17:15.542 "nvme_io": false, 00:17:15.542 "nvme_io_md": false, 00:17:15.542 "write_zeroes": true, 00:17:15.542 "zcopy": false, 00:17:15.542 "get_zone_info": false, 00:17:15.542 "zone_management": false, 00:17:15.542 "zone_append": false, 00:17:15.542 "compare": false, 00:17:15.542 "compare_and_write": false, 00:17:15.542 "abort": false, 00:17:15.542 "seek_hole": false, 00:17:15.542 "seek_data": false, 00:17:15.542 "copy": false, 00:17:15.542 "nvme_iov_md": false 00:17:15.542 }, 00:17:15.542 "driver_specific": { 00:17:15.542 "ftl": { 00:17:15.542 "base_bdev": "27e3433d-098c-4889-8b4e-4b95b76fb9fb", 00:17:15.542 "cache": "nvc0n1p0" 00:17:15.542 } 00:17:15.542 } 00:17:15.542 } 00:17:15.542 ] 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- common/autotest_common.sh@911 -- # return 0 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- ftl/fio.sh@68 -- # echo '{"subsystems": [' 00:17:15.542 05:07:35 ftl.ftl_fio_basic -- ftl/fio.sh@69 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py save_subsystem_config -n bdev 00:17:15.800 05:07:35 ftl.ftl_fio_basic -- ftl/fio.sh@70 -- # echo ']}' 00:17:15.800 05:07:35 ftl.ftl_fio_basic -- ftl/fio.sh@73 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_unload -b ftl0 00:17:16.060 [2024-12-15 05:07:35.944753] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.944792] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:17:16.060 [2024-12-15 05:07:35.944805] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:17:16.060 [2024-12-15 05:07:35.944813] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.944846] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:17:16.060 [2024-12-15 05:07:35.945297] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.945338] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:17:16.060 [2024-12-15 05:07:35.945351] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.438 ms 00:17:16.060 [2024-12-15 05:07:35.945372] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.945868] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.945888] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:17:16.060 [2024-12-15 05:07:35.945907] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.462 ms 00:17:16.060 [2024-12-15 05:07:35.945927] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.949175] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.949203] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:17:16.060 [2024-12-15 
05:07:35.949213] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.227 ms 00:17:16.060 [2024-12-15 05:07:35.949223] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.955446] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.955478] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:17:16.060 [2024-12-15 05:07:35.955488] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.196 ms 00:17:16.060 [2024-12-15 05:07:35.955497] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.956964] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.957004] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:17:16.060 [2024-12-15 05:07:35.957013] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.392 ms 00:17:16.060 [2024-12-15 05:07:35.957025] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.960366] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.960408] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:17:16.060 [2024-12-15 05:07:35.960420] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.267 ms 00:17:16.060 [2024-12-15 05:07:35.960430] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.960625] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.960643] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:17:16.060 [2024-12-15 05:07:35.960651] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.129 ms 00:17:16.060 [2024-12-15 05:07:35.960660] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.961980] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.962014] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:17:16.060 [2024-12-15 05:07:35.962023] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.291 ms 00:17:16.060 [2024-12-15 05:07:35.962032] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.963022] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.963055] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:17:16.060 [2024-12-15 05:07:35.963063] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.951 ms 00:17:16.060 [2024-12-15 05:07:35.963072] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.963889] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.963923] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:17:16.060 [2024-12-15 05:07:35.963932] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.782 ms 00:17:16.060 [2024-12-15 05:07:35.963940] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.964837] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.060 [2024-12-15 05:07:35.964874] mngt/ftl_mngt.c: 428:trace_step: 
*NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:17:16.060 [2024-12-15 05:07:35.964883] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.812 ms 00:17:16.060 [2024-12-15 05:07:35.964891] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.060 [2024-12-15 05:07:35.964932] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:17:16.060 [2024-12-15 05:07:35.964947] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:17:16.060 [2024-12-15 05:07:35.964956] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:17:16.060 [2024-12-15 05:07:35.964966] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:17:16.060 [2024-12-15 05:07:35.964974] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:17:16.060 [2024-12-15 05:07:35.964985] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:17:16.060 [2024-12-15 05:07:35.964992] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:17:16.060 [2024-12-15 05:07:35.965001] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:17:16.060 [2024-12-15 05:07:35.965009] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965018] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965025] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965033] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965041] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965051] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965058] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965067] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965074] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965082] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965090] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965099] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965106] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965116] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965123] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 
05:07:35.965133] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965141] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965150] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965157] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965166] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965173] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965182] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965190] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965206] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965213] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965226] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965235] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965245] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965252] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965262] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965269] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965278] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965285] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965294] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965301] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965310] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965317] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965326] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965333] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965342] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 
00:17:16.061 [2024-12-15 05:07:35.965349] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965357] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965365] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965375] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965382] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965393] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965400] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965409] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965416] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965425] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965452] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965461] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965469] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965478] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965485] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965494] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965501] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965512] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965520] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965531] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965538] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965549] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965556] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965565] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965573] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 
wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965581] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965588] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965597] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965605] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965615] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965622] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965631] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965638] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965646] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965654] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965662] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965669] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965692] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965699] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965708] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965715] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965723] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965730] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965739] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:17:16.061 [2024-12-15 05:07:35.965746] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:17:16.062 [2024-12-15 05:07:35.965755] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:17:16.062 [2024-12-15 05:07:35.965762] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:17:16.062 [2024-12-15 05:07:35.965770] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:17:16.062 [2024-12-15 05:07:35.965778] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:17:16.062 [2024-12-15 05:07:35.965789] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] 
Band 97: 0 / 261120 wr_cnt: 0 state: free 00:17:16.062 [2024-12-15 05:07:35.965796] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:17:16.062 [2024-12-15 05:07:35.965806] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:17:16.062 [2024-12-15 05:07:35.965813] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:17:16.062 [2024-12-15 05:07:35.965833] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:17:16.062 [2024-12-15 05:07:35.965843] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: c2c96026-3b10-460b-8866-2d7afa6cf328 00:17:16.062 [2024-12-15 05:07:35.965852] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:17:16.062 [2024-12-15 05:07:35.965869] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:17:16.062 [2024-12-15 05:07:35.965877] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:17:16.062 [2024-12-15 05:07:35.965884] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:17:16.062 [2024-12-15 05:07:35.965892] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:17:16.062 [2024-12-15 05:07:35.965900] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:17:16.062 [2024-12-15 05:07:35.965908] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:17:16.062 [2024-12-15 05:07:35.965915] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:17:16.062 [2024-12-15 05:07:35.965922] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:17:16.062 [2024-12-15 05:07:35.965929] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.062 [2024-12-15 05:07:35.965947] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:17:16.062 [2024-12-15 05:07:35.965955] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.998 ms 00:17:16.062 [2024-12-15 05:07:35.965964] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.967521] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.062 [2024-12-15 05:07:35.967551] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:17:16.062 [2024-12-15 05:07:35.967559] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.528 ms 00:17:16.062 [2024-12-15 05:07:35.967568] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.967673] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:17:16.062 [2024-12-15 05:07:35.967684] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:17:16.062 [2024-12-15 05:07:35.967693] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.054 ms 00:17:16.062 [2024-12-15 05:07:35.967703] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.973120] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.973167] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:17:16.062 [2024-12-15 05:07:35.973176] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.973186] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 
[2024-12-15 05:07:35.973243] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.973253] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:17:16.062 [2024-12-15 05:07:35.973261] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.973272] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.973331] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.973344] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:17:16.062 [2024-12-15 05:07:35.973351] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.973360] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.973381] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.973391] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:17:16.062 [2024-12-15 05:07:35.973398] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.973406] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.982875] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.982923] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:17:16.062 [2024-12-15 05:07:35.982933] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.982952] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.990904] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.990949] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:17:16.062 [2024-12-15 05:07:35.990958] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.990970] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.991038] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.991051] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:17:16.062 [2024-12-15 05:07:35.991060] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.991069] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.991120] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.991130] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:17:16.062 [2024-12-15 05:07:35.991149] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.991157] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.991237] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.991247] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:17:16.062 [2024-12-15 05:07:35.991255] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.991264] 
mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.991306] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.991317] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:17:16.062 [2024-12-15 05:07:35.991325] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.991333] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.991388] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.991401] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:17:16.062 [2024-12-15 05:07:35.991408] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.991427] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.991495] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:17:16.062 [2024-12-15 05:07:35.991506] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:17:16.062 [2024-12-15 05:07:35.991515] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:17:16.062 [2024-12-15 05:07:35.991524] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:17:16.062 [2024-12-15 05:07:35.991708] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 46.902 ms, result 0 00:17:16.062 true 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- ftl/fio.sh@75 -- # killprocess 88199 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- common/autotest_common.sh@954 -- # '[' -z 88199 ']' 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- common/autotest_common.sh@958 -- # kill -0 88199 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- common/autotest_common.sh@959 -- # uname 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 88199 00:17:16.062 killing process with pid 88199 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- common/autotest_common.sh@972 -- # echo 'killing process with pid 88199' 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- common/autotest_common.sh@973 -- # kill 88199 00:17:16.062 05:07:36 ftl.ftl_fio_basic -- common/autotest_common.sh@978 -- # wait 88199 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- ftl/fio.sh@76 -- # trap - SIGINT SIGTERM EXIT 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- ftl/fio.sh@78 -- # for test in ${tests} 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- ftl/fio.sh@79 -- # timing_enter randw-verify 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@726 -- # xtrace_disable 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@10 -- # set +x 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- ftl/fio.sh@80 -- # fio_bdev /home/vagrant/spdk_repo/spdk/test/ftl/config/fio/randw-verify.fio 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev /home/vagrant/spdk_repo/spdk/test/ftl/config/fio/randw-verify.fio 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- 
common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1343 -- # local sanitizers 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1345 -- # shift 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1347 -- # local asan_lib= 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # grep libasan 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1351 -- # break 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:17:21.334 05:07:40 ftl.ftl_fio_basic -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio /home/vagrant/spdk_repo/spdk/test/ftl/config/fio/randw-verify.fio 00:17:21.334 test: (g=0): rw=randwrite, bs=(R) 68.0KiB-68.0KiB, (W) 68.0KiB-68.0KiB, (T) 68.0KiB-68.0KiB, ioengine=spdk_bdev, iodepth=1 00:17:21.334 fio-3.35 00:17:21.334 Starting 1 thread 00:17:29.459 00:17:29.459 test: (groupid=0, jobs=1): err= 0: pid=88357: Sun Dec 15 05:07:48 2024 00:17:29.459 read: IOPS=613, BW=40.7MiB/s (42.7MB/s)(255MiB/6250msec) 00:17:29.459 slat (nsec): min=3916, max=26789, avg=5758.34, stdev=1982.16 00:17:29.459 clat (usec): min=231, max=8472, avg=754.01, stdev=209.11 00:17:29.459 lat (usec): min=235, max=8477, avg=759.77, stdev=209.30 00:17:29.459 clat percentiles (usec): 00:17:29.459 | 1.00th=[ 277], 5.00th=[ 293], 10.00th=[ 486], 20.00th=[ 717], 00:17:29.459 | 30.00th=[ 783], 40.00th=[ 791], 50.00th=[ 799], 60.00th=[ 816], 00:17:29.459 | 70.00th=[ 824], 80.00th=[ 832], 90.00th=[ 857], 95.00th=[ 889], 00:17:29.459 | 99.00th=[ 1074], 99.50th=[ 1139], 99.90th=[ 1369], 99.95th=[ 2606], 00:17:29.459 | 99.99th=[ 8455] 00:17:29.459 write: IOPS=617, BW=41.0MiB/s (43.0MB/s)(256MiB/6246msec); 0 zone resets 00:17:29.459 slat (usec): min=14, max=145, avg=20.01, stdev= 3.92 00:17:29.459 clat (usec): min=260, max=2661, avg=835.17, stdev=187.28 00:17:29.459 lat (usec): min=279, max=2686, avg=855.18, stdev=187.33 00:17:29.459 clat percentiles (usec): 00:17:29.459 | 1.00th=[ 302], 5.00th=[ 363], 10.00th=[ 562], 20.00th=[ 799], 00:17:29.459 | 30.00th=[ 848], 40.00th=[ 857], 50.00th=[ 873], 60.00th=[ 889], 00:17:29.459 | 70.00th=[ 906], 80.00th=[ 914], 90.00th=[ 930], 95.00th=[ 1004], 00:17:29.459 | 99.00th=[ 1352], 99.50th=[ 1598], 99.90th=[ 1745], 99.95th=[ 1827], 00:17:29.459 | 99.99th=[ 2671] 00:17:29.459 bw ( KiB/s): min=39168, max=61472, per=98.67%, avg=41423.33, stdev=6322.43, samples=12 00:17:29.459 iops : min= 576, max= 904, avg=609.17, stdev=92.98, samples=12 00:17:29.459 lat (usec) : 250=0.13%, 500=8.70%, 750=10.42%, 1000=77.21% 
00:17:29.459 lat (msec) : 2=3.49%, 4=0.04%, 10=0.01% 00:17:29.459 cpu : usr=99.22%, sys=0.05%, ctx=8, majf=0, minf=1179 00:17:29.459 IO depths : 1=100.0%, 2=0.0%, 4=0.0%, 8=0.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:17:29.459 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:17:29.459 complete : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:17:29.459 issued rwts: total=3833,3856,0,0 short=0,0,0,0 dropped=0,0,0,0 00:17:29.459 latency : target=0, window=0, percentile=100.00%, depth=1 00:17:29.459 00:17:29.459 Run status group 0 (all jobs): 00:17:29.459 READ: bw=40.7MiB/s (42.7MB/s), 40.7MiB/s-40.7MiB/s (42.7MB/s-42.7MB/s), io=255MiB (267MB), run=6250-6250msec 00:17:29.459 WRITE: bw=41.0MiB/s (43.0MB/s), 41.0MiB/s-41.0MiB/s (43.0MB/s-43.0MB/s), io=256MiB (269MB), run=6246-6246msec 00:17:29.459 ----------------------------------------------------- 00:17:29.459 Suppressions used: 00:17:29.459 count bytes template 00:17:29.459 1 5 /usr/src/fio/parse.c 00:17:29.459 1 8 libtcmalloc_minimal.so 00:17:29.459 1 904 libcrypto.so 00:17:29.459 ----------------------------------------------------- 00:17:29.459 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- ftl/fio.sh@81 -- # timing_exit randw-verify 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@732 -- # xtrace_disable 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@10 -- # set +x 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- ftl/fio.sh@78 -- # for test in ${tests} 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- ftl/fio.sh@79 -- # timing_enter randw-verify-j2 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@726 -- # xtrace_disable 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@10 -- # set +x 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- ftl/fio.sh@80 -- # fio_bdev /home/vagrant/spdk_repo/spdk/test/ftl/config/fio/randw-verify-j2.fio 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev /home/vagrant/spdk_repo/spdk/test/ftl/config/fio/randw-verify-j2.fio 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1343 -- # local sanitizers 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1345 -- # shift 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1347 -- # local asan_lib= 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # grep libasan 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1351 -- # break 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- 
common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:17:29.459 05:07:48 ftl.ftl_fio_basic -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio /home/vagrant/spdk_repo/spdk/test/ftl/config/fio/randw-verify-j2.fio 00:17:29.459 first_half: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=128 00:17:29.459 second_half: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=128 00:17:29.459 fio-3.35 00:17:29.459 Starting 2 threads 00:17:56.023 00:17:56.023 first_half: (groupid=0, jobs=1): err= 0: pid=88471: Sun Dec 15 05:08:13 2024 00:17:56.023 read: IOPS=2740, BW=10.7MiB/s (11.2MB/s)(255MiB/23860msec) 00:17:56.023 slat (nsec): min=3083, max=31154, avg=5357.37, stdev=1265.98 00:17:56.023 clat (usec): min=587, max=268428, avg=35512.53, stdev=18014.96 00:17:56.023 lat (usec): min=591, max=268434, avg=35517.89, stdev=18015.01 00:17:56.023 clat percentiles (msec): 00:17:56.023 | 1.00th=[ 10], 5.00th=[ 30], 10.00th=[ 31], 20.00th=[ 31], 00:17:56.023 | 30.00th=[ 31], 40.00th=[ 32], 50.00th=[ 32], 60.00th=[ 32], 00:17:56.023 | 70.00th=[ 34], 80.00th=[ 37], 90.00th=[ 42], 95.00th=[ 51], 00:17:56.023 | 99.00th=[ 142], 99.50th=[ 174], 99.90th=[ 203], 99.95th=[ 230], 00:17:56.023 | 99.99th=[ 262] 00:17:56.023 write: IOPS=2910, BW=11.4MiB/s (11.9MB/s)(256MiB/22514msec); 0 zone resets 00:17:56.023 slat (usec): min=3, max=1496, avg= 7.07, stdev= 8.77 00:17:56.023 clat (usec): min=415, max=76431, avg=11118.56, stdev=17525.39 00:17:56.023 lat (usec): min=423, max=76438, avg=11125.63, stdev=17525.64 00:17:56.023 clat percentiles (usec): 00:17:56.023 | 1.00th=[ 668], 5.00th=[ 791], 10.00th=[ 979], 20.00th=[ 1713], 00:17:56.023 | 30.00th=[ 3326], 40.00th=[ 4359], 50.00th=[ 5211], 60.00th=[ 5800], 00:17:56.023 | 70.00th=[ 7242], 80.00th=[10814], 90.00th=[22414], 95.00th=[62653], 00:17:56.023 | 99.00th=[70779], 99.50th=[72877], 99.90th=[74974], 99.95th=[74974], 00:17:56.023 | 99.99th=[76022] 00:17:56.023 bw ( KiB/s): min= 424, max=40528, per=80.40%, avg=18724.57, stdev=13837.27, samples=28 00:17:56.023 iops : min= 106, max=10132, avg=4681.14, stdev=3459.32, samples=28 00:17:56.023 lat (usec) : 500=0.01%, 750=1.76%, 1000=3.56% 00:17:56.023 lat (msec) : 2=5.41%, 4=7.50%, 10=21.51%, 20=5.73%, 50=47.41% 00:17:56.023 lat (msec) : 100=6.25%, 250=0.86%, 500=0.01% 00:17:56.023 cpu : usr=99.19%, sys=0.13%, ctx=37, majf=0, minf=5611 00:17:56.023 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.1%, >=64=99.8% 00:17:56.023 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:17:56.023 complete : 0=0.0%, 4=100.0%, 8=0.1%, 16=0.1%, 32=0.0%, 64=0.0%, >=64=0.1% 00:17:56.023 issued rwts: total=65382,65536,0,0 short=0,0,0,0 dropped=0,0,0,0 00:17:56.023 latency : target=0, window=0, percentile=100.00%, depth=128 00:17:56.023 second_half: (groupid=0, jobs=1): err= 0: pid=88472: Sun Dec 15 05:08:13 2024 00:17:56.023 read: IOPS=2725, BW=10.6MiB/s (11.2MB/s)(255MiB/23920msec) 00:17:56.023 slat (nsec): min=3158, max=49400, avg=4261.43, stdev=1184.05 00:17:56.023 clat (usec): min=578, max=255629, avg=35949.47, stdev=20231.50 00:17:56.023 lat (usec): min=582, max=255634, avg=35953.73, stdev=20231.72 00:17:56.023 clat percentiles (msec): 00:17:56.023 | 1.00th=[ 6], 5.00th=[ 30], 10.00th=[ 31], 20.00th=[ 31], 00:17:56.023 | 30.00th=[ 31], 40.00th=[ 32], 50.00th=[ 32], 60.00th=[ 32], 00:17:56.023 | 70.00th=[ 34], 80.00th=[ 36], 90.00th=[ 
42], 95.00th=[ 56], 00:17:56.023 | 99.00th=[ 140], 99.50th=[ 178], 99.90th=[ 239], 99.95th=[ 251], 00:17:56.023 | 99.99th=[ 255] 00:17:56.023 write: IOPS=3501, BW=13.7MiB/s (14.3MB/s)(256MiB/18715msec); 0 zone resets 00:17:56.023 slat (usec): min=3, max=1403, avg= 6.11, stdev= 9.81 00:17:56.023 clat (usec): min=371, max=76280, avg=10937.45, stdev=18017.72 00:17:56.023 lat (usec): min=377, max=76285, avg=10943.56, stdev=18017.89 00:17:56.023 clat percentiles (usec): 00:17:56.023 | 1.00th=[ 660], 5.00th=[ 742], 10.00th=[ 824], 20.00th=[ 1020], 00:17:56.023 | 30.00th=[ 1205], 40.00th=[ 1860], 50.00th=[ 3195], 60.00th=[ 5211], 00:17:56.023 | 70.00th=[ 8094], 80.00th=[13829], 90.00th=[36963], 95.00th=[62129], 00:17:56.023 | 99.00th=[70779], 99.50th=[71828], 99.90th=[73925], 99.95th=[74974], 00:17:56.023 | 99.99th=[76022] 00:17:56.023 bw ( KiB/s): min= 736, max=62632, per=97.89%, avg=22795.13, stdev=16659.93, samples=23 00:17:56.023 iops : min= 184, max=15658, avg=5698.78, stdev=4164.98, samples=23 00:17:56.023 lat (usec) : 500=0.02%, 750=2.67%, 1000=6.84% 00:17:56.023 lat (msec) : 2=11.16%, 4=6.87%, 10=10.38%, 20=6.67%, 50=48.10% 00:17:56.023 lat (msec) : 100=6.09%, 250=1.19%, 500=0.03% 00:17:56.023 cpu : usr=99.32%, sys=0.14%, ctx=34, majf=0, minf=5535 00:17:56.023 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.1%, >=64=99.9% 00:17:56.023 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:17:56.023 complete : 0=0.0%, 4=100.0%, 8=0.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.1% 00:17:56.023 issued rwts: total=65182,65536,0,0 short=0,0,0,0 dropped=0,0,0,0 00:17:56.023 latency : target=0, window=0, percentile=100.00%, depth=128 00:17:56.023 00:17:56.023 Run status group 0 (all jobs): 00:17:56.023 READ: bw=21.3MiB/s (22.4MB/s), 10.6MiB/s-10.7MiB/s (11.2MB/s-11.2MB/s), io=510MiB (535MB), run=23860-23920msec 00:17:56.023 WRITE: bw=22.7MiB/s (23.8MB/s), 11.4MiB/s-13.7MiB/s (11.9MB/s-14.3MB/s), io=512MiB (537MB), run=18715-22514msec 00:17:56.023 ----------------------------------------------------- 00:17:56.023 Suppressions used: 00:17:56.023 count bytes template 00:17:56.023 2 10 /usr/src/fio/parse.c 00:17:56.023 4 384 /usr/src/fio/iolog.c 00:17:56.023 1 8 libtcmalloc_minimal.so 00:17:56.023 1 904 libcrypto.so 00:17:56.023 ----------------------------------------------------- 00:17:56.023 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- ftl/fio.sh@81 -- # timing_exit randw-verify-j2 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@732 -- # xtrace_disable 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@10 -- # set +x 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- ftl/fio.sh@78 -- # for test in ${tests} 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- ftl/fio.sh@79 -- # timing_enter randw-verify-depth128 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@726 -- # xtrace_disable 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@10 -- # set +x 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- ftl/fio.sh@80 -- # fio_bdev /home/vagrant/spdk_repo/spdk/test/ftl/config/fio/randw-verify-depth128.fio 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1360 -- # fio_plugin /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev /home/vagrant/spdk_repo/spdk/test/ftl/config/fio/randw-verify-depth128.fio 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1341 -- # local fio_dir=/usr/src/fio 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1343 -- # sanitizers=('libasan' 
'libclang_rt.asan') 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1343 -- # local sanitizers 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1344 -- # local plugin=/home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1345 -- # shift 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1347 -- # local asan_lib= 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1348 -- # for sanitizer in "${sanitizers[@]}" 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # ldd /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # awk '{print $3}' 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # grep libasan 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1349 -- # asan_lib=/usr/lib64/libasan.so.8 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1350 -- # [[ -n /usr/lib64/libasan.so.8 ]] 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1351 -- # break 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1356 -- # LD_PRELOAD='/usr/lib64/libasan.so.8 /home/vagrant/spdk_repo/spdk/build/fio/spdk_bdev' 00:17:56.023 05:08:14 ftl.ftl_fio_basic -- common/autotest_common.sh@1356 -- # /usr/src/fio/fio /home/vagrant/spdk_repo/spdk/test/ftl/config/fio/randw-verify-depth128.fio 00:17:56.023 test: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=128 00:17:56.023 fio-3.35 00:17:56.023 Starting 1 thread 00:18:14.130 00:18:14.130 test: (groupid=0, jobs=1): err= 0: pid=88779: Sun Dec 15 05:08:31 2024 00:18:14.130 read: IOPS=7098, BW=27.7MiB/s (29.1MB/s)(255MiB/9185msec) 00:18:14.130 slat (nsec): min=3125, max=86951, avg=5123.26, stdev=1384.32 00:18:14.130 clat (usec): min=560, max=40359, avg=18022.77, stdev=2998.81 00:18:14.130 lat (usec): min=568, max=40364, avg=18027.90, stdev=2998.97 00:18:14.130 clat percentiles (usec): 00:18:14.130 | 1.00th=[14877], 5.00th=[15139], 10.00th=[15401], 20.00th=[15664], 00:18:14.130 | 30.00th=[15926], 40.00th=[16450], 50.00th=[16909], 60.00th=[17695], 00:18:14.130 | 70.00th=[19006], 80.00th=[20317], 90.00th=[22152], 95.00th=[23725], 00:18:14.130 | 99.00th=[27395], 99.50th=[30278], 99.90th=[36439], 99.95th=[37487], 00:18:14.130 | 99.99th=[40109] 00:18:14.130 write: IOPS=10.0k, BW=39.1MiB/s (41.0MB/s)(256MiB/6548msec); 0 zone resets 00:18:14.130 slat (usec): min=4, max=469, avg= 7.99, stdev= 4.48 00:18:14.130 clat (usec): min=556, max=60810, avg=12730.99, stdev=14264.70 00:18:14.130 lat (usec): min=562, max=60816, avg=12738.98, stdev=14264.78 00:18:14.130 clat percentiles (usec): 00:18:14.130 | 1.00th=[ 824], 5.00th=[ 1090], 10.00th=[ 1254], 20.00th=[ 1500], 00:18:14.130 | 30.00th=[ 1827], 40.00th=[ 2540], 50.00th=[ 8717], 60.00th=[11731], 00:18:14.130 | 70.00th=[14746], 80.00th=[17695], 90.00th=[39060], 95.00th=[45876], 00:18:14.130 | 99.00th=[51643], 99.50th=[53740], 99.90th=[57410], 99.95th=[58459], 00:18:14.130 | 99.99th=[59507] 00:18:14.130 bw ( KiB/s): min= 2584, max=56400, per=93.53%, avg=37443.57, stdev=12282.26, samples=14 00:18:14.130 iops : min= 646, max=14100, avg=9360.71, stdev=3070.54, samples=14 00:18:14.130 lat (usec) : 750=0.25%, 1000=1.37% 00:18:14.130 lat (msec) : 2=15.08%, 4=4.27%, 10=6.10%, 20=52.91%, 50=19.09% 00:18:14.130 lat (msec) : 100=0.93% 00:18:14.130 cpu : 
usr=98.84%, sys=0.25%, ctx=25, majf=0, minf=5575 00:18:14.130 IO depths : 1=0.1%, 2=0.1%, 4=0.1%, 8=0.1%, 16=0.1%, 32=0.1%, >=64=99.8% 00:18:14.130 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:18:14.130 complete : 0=0.0%, 4=100.0%, 8=0.1%, 16=0.1%, 32=0.0%, 64=0.0%, >=64=0.1% 00:18:14.130 issued rwts: total=65202,65536,0,0 short=0,0,0,0 dropped=0,0,0,0 00:18:14.130 latency : target=0, window=0, percentile=100.00%, depth=128 00:18:14.130 00:18:14.130 Run status group 0 (all jobs): 00:18:14.130 READ: bw=27.7MiB/s (29.1MB/s), 27.7MiB/s-27.7MiB/s (29.1MB/s-29.1MB/s), io=255MiB (267MB), run=9185-9185msec 00:18:14.130 WRITE: bw=39.1MiB/s (41.0MB/s), 39.1MiB/s-39.1MiB/s (41.0MB/s-41.0MB/s), io=256MiB (268MB), run=6548-6548msec 00:18:14.130 ----------------------------------------------------- 00:18:14.130 Suppressions used: 00:18:14.130 count bytes template 00:18:14.130 1 5 /usr/src/fio/parse.c 00:18:14.130 2 192 /usr/src/fio/iolog.c 00:18:14.130 1 8 libtcmalloc_minimal.so 00:18:14.130 1 904 libcrypto.so 00:18:14.130 ----------------------------------------------------- 00:18:14.130 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- ftl/fio.sh@81 -- # timing_exit randw-verify-depth128 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- common/autotest_common.sh@732 -- # xtrace_disable 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- common/autotest_common.sh@10 -- # set +x 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- ftl/fio.sh@84 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:18:14.130 Remove shared memory files 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- ftl/fio.sh@85 -- # remove_shm 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- ftl/common.sh@204 -- # echo Remove shared memory files 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- ftl/common.sh@205 -- # rm -f rm -f 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- ftl/common.sh@206 -- # rm -f rm -f 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- ftl/common.sh@207 -- # rm -f rm -f /dev/shm/spdk_tgt_trace.pid71206 /dev/shm/spdk_tgt_trace.pid87149 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- ftl/common.sh@208 -- # rm -f rm -f /dev/shm/iscsi 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- ftl/common.sh@209 -- # rm -f rm -f 00:18:14.130 ************************************ 00:18:14.130 END TEST ftl_fio_basic 00:18:14.130 ************************************ 00:18:14.130 00:18:14.130 real 1m3.539s 00:18:14.130 user 2m19.006s 00:18:14.130 sys 0m2.870s 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- common/autotest_common.sh@1130 -- # xtrace_disable 00:18:14.130 05:08:32 ftl.ftl_fio_basic -- common/autotest_common.sh@10 -- # set +x 00:18:14.130 05:08:32 ftl -- ftl/ftl.sh@74 -- # run_test ftl_bdevperf /home/vagrant/spdk_repo/spdk/test/ftl/bdevperf.sh 0000:00:11.0 0000:00:10.0 00:18:14.130 05:08:32 ftl -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:18:14.130 05:08:32 ftl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:18:14.130 05:08:32 ftl -- common/autotest_common.sh@10 -- # set +x 00:18:14.130 ************************************ 00:18:14.130 START TEST ftl_bdevperf 00:18:14.130 ************************************ 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/ftl/bdevperf.sh 0000:00:11.0 0000:00:10.0 00:18:14.130 * Looking for test storage... 
00:18:14.130 * Found test storage at /home/vagrant/spdk_repo/spdk/test/ftl 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@1711 -- # lcov --version 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@333 -- # local ver1 ver1_l 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@334 -- # local ver2 ver2_l 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@336 -- # IFS=.-: 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@336 -- # read -ra ver1 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@337 -- # IFS=.-: 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@337 -- # read -ra ver2 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@338 -- # local 'op=<' 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@340 -- # ver1_l=2 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@341 -- # ver2_l=1 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@344 -- # case "$op" in 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@345 -- # : 1 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@364 -- # (( v = 0 )) 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? ver1_l : ver2_l) )) 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@365 -- # decimal 1 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@353 -- # local d=1 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@355 -- # echo 1 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@365 -- # ver1[v]=1 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@366 -- # decimal 2 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@353 -- # local d=2 00:18:14.130 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@355 -- # echo 2 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@366 -- # ver2[v]=2 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- scripts/common.sh@368 -- # return 0 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:18:14.131 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:14.131 --rc genhtml_branch_coverage=1 00:18:14.131 --rc genhtml_function_coverage=1 00:18:14.131 --rc genhtml_legend=1 00:18:14.131 --rc geninfo_all_blocks=1 00:18:14.131 --rc geninfo_unexecuted_blocks=1 00:18:14.131 00:18:14.131 ' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:18:14.131 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:14.131 --rc genhtml_branch_coverage=1 00:18:14.131 
--rc genhtml_function_coverage=1 00:18:14.131 --rc genhtml_legend=1 00:18:14.131 --rc geninfo_all_blocks=1 00:18:14.131 --rc geninfo_unexecuted_blocks=1 00:18:14.131 00:18:14.131 ' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:18:14.131 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:14.131 --rc genhtml_branch_coverage=1 00:18:14.131 --rc genhtml_function_coverage=1 00:18:14.131 --rc genhtml_legend=1 00:18:14.131 --rc geninfo_all_blocks=1 00:18:14.131 --rc geninfo_unexecuted_blocks=1 00:18:14.131 00:18:14.131 ' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:18:14.131 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:14.131 --rc genhtml_branch_coverage=1 00:18:14.131 --rc genhtml_function_coverage=1 00:18:14.131 --rc genhtml_legend=1 00:18:14.131 --rc geninfo_all_blocks=1 00:18:14.131 --rc geninfo_unexecuted_blocks=1 00:18:14.131 00:18:14.131 ' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/bdevperf.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/ftl/common.sh 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/ftl/bdevperf.sh 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@8 -- # testdir=/home/vagrant/spdk_repo/spdk/test/ftl 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@9 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl/../.. 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@9 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@10 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@12 -- # export 'ftl_tgt_core_mask=[0]' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@12 -- # ftl_tgt_core_mask='[0]' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@14 -- # export spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@14 -- # spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@15 -- # export 'spdk_tgt_cpumask=[0]' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@15 -- # spdk_tgt_cpumask='[0]' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@16 -- # export spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@16 -- # spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@17 -- # export spdk_tgt_pid= 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@17 -- # spdk_tgt_pid= 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@19 -- # export spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@19 -- # spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@20 -- # export 'spdk_ini_cpumask=[1]' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@20 -- # spdk_ini_cpumask='[1]' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@21 -- # export spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@21 -- # spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@22 -- # export 
spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@22 -- # spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@23 -- # export spdk_ini_pid= 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@23 -- # spdk_ini_pid= 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@25 -- # export spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/common.sh@25 -- # spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/bdevperf.sh@11 -- # device=0000:00:11.0 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/bdevperf.sh@12 -- # cache_device=0000:00:10.0 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/bdevperf.sh@13 -- # use_append= 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/bdevperf.sh@14 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/bdevperf.sh@15 -- # timeout=240 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/bdevperf.sh@18 -- # bdevperf_pid=89029 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/bdevperf.sh@20 -- # trap 'killprocess $bdevperf_pid; exit 1' SIGINT SIGTERM EXIT 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/bdevperf.sh@21 -- # waitforlisten 89029 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@835 -- # '[' -z 89029 ']' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- ftl/bdevperf.sh@17 -- # /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -z -T ftl0 00:18:14.131 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@840 -- # local max_retries=100 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@844 -- # xtrace_disable 00:18:14.131 05:08:33 ftl.ftl_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:18:14.131 [2024-12-15 05:08:33.250302] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
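The xtrace above shows ftl/common.sh seeding the target paths and bdevperf.sh then starting the bdevperf app in RPC-wait mode (-z) before any FTL bdev exists. A minimal sketch of that launch pattern, reconstructed from the trace (killprocess and waitforlisten are the autotest_common.sh helpers visible above; the backgrounding with & and $! is assumed, since the trace only shows the resulting pid):

  # Start bdevperf idle (-z) so the FTL bdev can be assembled over RPC first.
  /home/vagrant/spdk_repo/spdk/build/examples/bdevperf -z -T ftl0 &
  bdevperf_pid=$!
  # Tear the app down if the test is interrupted or exits early.
  trap 'killprocess $bdevperf_pid; exit 1' SIGINT SIGTERM EXIT
  # Block until /var/tmp/spdk.sock answers RPCs; this helper prints the
  # "Waiting for process to start up..." message seen above.
  waitforlisten $bdevperf_pid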
00:18:14.131 [2024-12-15 05:08:33.250475] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid89029 ] 00:18:14.131 [2024-12-15 05:08:33.411965] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:14.131 [2024-12-15 05:08:33.452262] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:18:14.131 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:18:14.131 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@868 -- # return 0 00:18:14.131 05:08:34 ftl.ftl_bdevperf -- ftl/bdevperf.sh@22 -- # create_base_bdev nvme0 0000:00:11.0 103424 00:18:14.131 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@54 -- # local name=nvme0 00:18:14.131 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@55 -- # local base_bdf=0000:00:11.0 00:18:14.131 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@56 -- # local size=103424 00:18:14.131 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@59 -- # local base_bdev 00:18:14.131 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@60 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t PCIe -a 0000:00:11.0 00:18:14.393 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@60 -- # base_bdev=nvme0n1 00:18:14.393 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@62 -- # local base_size 00:18:14.393 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@63 -- # get_bdev_size nvme0n1 00:18:14.393 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1382 -- # local bdev_name=nvme0n1 00:18:14.393 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1383 -- # local bdev_info 00:18:14.393 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1384 -- # local bs 00:18:14.393 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1385 -- # local nb 00:18:14.393 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b nvme0n1 00:18:14.654 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:18:14.654 { 00:18:14.654 "name": "nvme0n1", 00:18:14.654 "aliases": [ 00:18:14.654 "1229fa7c-08ef-4907-8f51-3ed8184d4eb1" 00:18:14.654 ], 00:18:14.654 "product_name": "NVMe disk", 00:18:14.654 "block_size": 4096, 00:18:14.654 "num_blocks": 1310720, 00:18:14.654 "uuid": "1229fa7c-08ef-4907-8f51-3ed8184d4eb1", 00:18:14.654 "numa_id": -1, 00:18:14.654 "assigned_rate_limits": { 00:18:14.655 "rw_ios_per_sec": 0, 00:18:14.655 "rw_mbytes_per_sec": 0, 00:18:14.655 "r_mbytes_per_sec": 0, 00:18:14.655 "w_mbytes_per_sec": 0 00:18:14.655 }, 00:18:14.655 "claimed": true, 00:18:14.655 "claim_type": "read_many_write_one", 00:18:14.655 "zoned": false, 00:18:14.655 "supported_io_types": { 00:18:14.655 "read": true, 00:18:14.655 "write": true, 00:18:14.655 "unmap": true, 00:18:14.655 "flush": true, 00:18:14.655 "reset": true, 00:18:14.655 "nvme_admin": true, 00:18:14.655 "nvme_io": true, 00:18:14.655 "nvme_io_md": false, 00:18:14.655 "write_zeroes": true, 00:18:14.655 "zcopy": false, 00:18:14.655 "get_zone_info": false, 00:18:14.655 "zone_management": false, 00:18:14.655 "zone_append": false, 00:18:14.655 "compare": true, 00:18:14.655 "compare_and_write": false, 00:18:14.655 "abort": true, 00:18:14.655 "seek_hole": false, 00:18:14.655 "seek_data": false, 00:18:14.655 "copy": true, 00:18:14.655 "nvme_iov_md": false 00:18:14.655 }, 00:18:14.655 "driver_specific": { 00:18:14.655 
"nvme": [ 00:18:14.655 { 00:18:14.655 "pci_address": "0000:00:11.0", 00:18:14.655 "trid": { 00:18:14.655 "trtype": "PCIe", 00:18:14.655 "traddr": "0000:00:11.0" 00:18:14.655 }, 00:18:14.655 "ctrlr_data": { 00:18:14.655 "cntlid": 0, 00:18:14.655 "vendor_id": "0x1b36", 00:18:14.655 "model_number": "QEMU NVMe Ctrl", 00:18:14.655 "serial_number": "12341", 00:18:14.655 "firmware_revision": "8.0.0", 00:18:14.655 "subnqn": "nqn.2019-08.org.qemu:12341", 00:18:14.655 "oacs": { 00:18:14.655 "security": 0, 00:18:14.655 "format": 1, 00:18:14.655 "firmware": 0, 00:18:14.655 "ns_manage": 1 00:18:14.655 }, 00:18:14.655 "multi_ctrlr": false, 00:18:14.655 "ana_reporting": false 00:18:14.655 }, 00:18:14.655 "vs": { 00:18:14.655 "nvme_version": "1.4" 00:18:14.655 }, 00:18:14.655 "ns_data": { 00:18:14.655 "id": 1, 00:18:14.655 "can_share": false 00:18:14.655 } 00:18:14.655 } 00:18:14.655 ], 00:18:14.655 "mp_policy": "active_passive" 00:18:14.655 } 00:18:14.655 } 00:18:14.655 ]' 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1387 -- # bs=4096 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1388 -- # nb=1310720 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1391 -- # bdev_size=5120 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- common/autotest_common.sh@1392 -- # echo 5120 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@63 -- # base_size=5120 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@64 -- # [[ 103424 -le 5120 ]] 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@67 -- # clear_lvols 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@28 -- # jq -r '.[] | .uuid' 00:18:14.655 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_get_lvstores 00:18:14.916 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@28 -- # stores=ee30c46d-8cc9-4635-81c9-3e11a9c843e5 00:18:14.916 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@29 -- # for lvs in $stores 00:18:14.916 05:08:34 ftl.ftl_bdevperf -- ftl/common.sh@30 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -u ee30c46d-8cc9-4635-81c9-3e11a9c843e5 00:18:15.176 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@68 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create_lvstore nvme0n1 lvs 00:18:15.474 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@68 -- # lvs=bb5d7f31-18e8-453d-88b2-99cec449a4bc 00:18:15.474 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@69 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create nvme0n1p0 103424 -t -u bb5d7f31-18e8-453d-88b2-99cec449a4bc 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- ftl/bdevperf.sh@22 -- # split_bdev=9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- ftl/bdevperf.sh@23 -- # create_nv_cache_bdev nvc0 0000:00:10.0 9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@35 -- # local name=nvc0 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@36 -- # local cache_bdf=0000:00:10.0 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@37 -- # local base_bdev=9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@38 -- # local cache_size= 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@41 -- # get_bdev_size 9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:15.775 05:08:35 
ftl.ftl_bdevperf -- common/autotest_common.sh@1382 -- # local bdev_name=9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1383 -- # local bdev_info 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1384 -- # local bs 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1385 -- # local nb 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:15.775 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:18:15.775 { 00:18:15.775 "name": "9d96c53c-3fc5-41bf-a1f2-e3632b9de32c", 00:18:15.775 "aliases": [ 00:18:15.775 "lvs/nvme0n1p0" 00:18:15.775 ], 00:18:15.775 "product_name": "Logical Volume", 00:18:15.775 "block_size": 4096, 00:18:15.775 "num_blocks": 26476544, 00:18:15.775 "uuid": "9d96c53c-3fc5-41bf-a1f2-e3632b9de32c", 00:18:15.775 "assigned_rate_limits": { 00:18:15.775 "rw_ios_per_sec": 0, 00:18:15.775 "rw_mbytes_per_sec": 0, 00:18:15.775 "r_mbytes_per_sec": 0, 00:18:15.775 "w_mbytes_per_sec": 0 00:18:15.775 }, 00:18:15.775 "claimed": false, 00:18:15.775 "zoned": false, 00:18:15.775 "supported_io_types": { 00:18:15.775 "read": true, 00:18:15.775 "write": true, 00:18:15.775 "unmap": true, 00:18:15.775 "flush": false, 00:18:15.775 "reset": true, 00:18:15.775 "nvme_admin": false, 00:18:15.775 "nvme_io": false, 00:18:15.775 "nvme_io_md": false, 00:18:15.775 "write_zeroes": true, 00:18:15.775 "zcopy": false, 00:18:15.775 "get_zone_info": false, 00:18:15.775 "zone_management": false, 00:18:15.775 "zone_append": false, 00:18:15.775 "compare": false, 00:18:15.775 "compare_and_write": false, 00:18:15.775 "abort": false, 00:18:15.775 "seek_hole": true, 00:18:15.775 "seek_data": true, 00:18:15.776 "copy": false, 00:18:15.776 "nvme_iov_md": false 00:18:15.776 }, 00:18:15.776 "driver_specific": { 00:18:15.776 "lvol": { 00:18:15.776 "lvol_store_uuid": "bb5d7f31-18e8-453d-88b2-99cec449a4bc", 00:18:15.776 "base_bdev": "nvme0n1", 00:18:15.776 "thin_provision": true, 00:18:15.776 "num_allocated_clusters": 0, 00:18:15.776 "snapshot": false, 00:18:15.776 "clone": false, 00:18:15.776 "esnap_clone": false 00:18:15.776 } 00:18:15.776 } 00:18:15.776 } 00:18:15.776 ]' 00:18:15.776 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:18:15.776 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1387 -- # bs=4096 00:18:15.776 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:18:15.776 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1388 -- # nb=26476544 00:18:15.776 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:18:15.776 05:08:35 ftl.ftl_bdevperf -- common/autotest_common.sh@1392 -- # echo 103424 00:18:15.776 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@41 -- # local base_size=5171 00:18:15.776 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@44 -- # local nvc_bdev 00:18:15.776 05:08:35 ftl.ftl_bdevperf -- ftl/common.sh@45 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvc0 -t PCIe -a 0000:00:10.0 00:18:16.037 05:08:36 ftl.ftl_bdevperf -- ftl/common.sh@45 -- # nvc_bdev=nvc0n1 00:18:16.037 05:08:36 ftl.ftl_bdevperf -- ftl/common.sh@47 -- # [[ -z '' ]] 00:18:16.037 05:08:36 ftl.ftl_bdevperf -- ftl/common.sh@48 -- # get_bdev_size 9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:16.037 05:08:36 ftl.ftl_bdevperf -- 
common/autotest_common.sh@1382 -- # local bdev_name=9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:16.037 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1383 -- # local bdev_info 00:18:16.037 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1384 -- # local bs 00:18:16.037 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1385 -- # local nb 00:18:16.037 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:16.299 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:18:16.299 { 00:18:16.299 "name": "9d96c53c-3fc5-41bf-a1f2-e3632b9de32c", 00:18:16.299 "aliases": [ 00:18:16.299 "lvs/nvme0n1p0" 00:18:16.299 ], 00:18:16.299 "product_name": "Logical Volume", 00:18:16.299 "block_size": 4096, 00:18:16.299 "num_blocks": 26476544, 00:18:16.299 "uuid": "9d96c53c-3fc5-41bf-a1f2-e3632b9de32c", 00:18:16.299 "assigned_rate_limits": { 00:18:16.299 "rw_ios_per_sec": 0, 00:18:16.299 "rw_mbytes_per_sec": 0, 00:18:16.299 "r_mbytes_per_sec": 0, 00:18:16.299 "w_mbytes_per_sec": 0 00:18:16.299 }, 00:18:16.299 "claimed": false, 00:18:16.299 "zoned": false, 00:18:16.299 "supported_io_types": { 00:18:16.299 "read": true, 00:18:16.299 "write": true, 00:18:16.299 "unmap": true, 00:18:16.299 "flush": false, 00:18:16.299 "reset": true, 00:18:16.299 "nvme_admin": false, 00:18:16.299 "nvme_io": false, 00:18:16.299 "nvme_io_md": false, 00:18:16.299 "write_zeroes": true, 00:18:16.299 "zcopy": false, 00:18:16.299 "get_zone_info": false, 00:18:16.299 "zone_management": false, 00:18:16.299 "zone_append": false, 00:18:16.299 "compare": false, 00:18:16.299 "compare_and_write": false, 00:18:16.299 "abort": false, 00:18:16.299 "seek_hole": true, 00:18:16.299 "seek_data": true, 00:18:16.299 "copy": false, 00:18:16.299 "nvme_iov_md": false 00:18:16.299 }, 00:18:16.299 "driver_specific": { 00:18:16.299 "lvol": { 00:18:16.299 "lvol_store_uuid": "bb5d7f31-18e8-453d-88b2-99cec449a4bc", 00:18:16.299 "base_bdev": "nvme0n1", 00:18:16.299 "thin_provision": true, 00:18:16.299 "num_allocated_clusters": 0, 00:18:16.299 "snapshot": false, 00:18:16.299 "clone": false, 00:18:16.299 "esnap_clone": false 00:18:16.299 } 00:18:16.299 } 00:18:16.299 } 00:18:16.299 ]' 00:18:16.299 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:18:16.299 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1387 -- # bs=4096 00:18:16.299 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1388 -- # nb=26476544 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1392 -- # echo 103424 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- ftl/common.sh@48 -- # cache_size=5171 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- ftl/common.sh@50 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_split_create nvc0n1 -s 5171 1 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- ftl/bdevperf.sh@23 -- # nv_cache=nvc0n1p0 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- ftl/bdevperf.sh@25 -- # get_bdev_size 9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1382 -- # local bdev_name=9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1383 -- # local bdev_info 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- 
common/autotest_common.sh@1384 -- # local bs 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1385 -- # local nb 00:18:16.560 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 9d96c53c-3fc5-41bf-a1f2-e3632b9de32c 00:18:16.821 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:18:16.821 { 00:18:16.821 "name": "9d96c53c-3fc5-41bf-a1f2-e3632b9de32c", 00:18:16.821 "aliases": [ 00:18:16.821 "lvs/nvme0n1p0" 00:18:16.821 ], 00:18:16.821 "product_name": "Logical Volume", 00:18:16.821 "block_size": 4096, 00:18:16.821 "num_blocks": 26476544, 00:18:16.821 "uuid": "9d96c53c-3fc5-41bf-a1f2-e3632b9de32c", 00:18:16.821 "assigned_rate_limits": { 00:18:16.821 "rw_ios_per_sec": 0, 00:18:16.821 "rw_mbytes_per_sec": 0, 00:18:16.821 "r_mbytes_per_sec": 0, 00:18:16.821 "w_mbytes_per_sec": 0 00:18:16.821 }, 00:18:16.821 "claimed": false, 00:18:16.821 "zoned": false, 00:18:16.821 "supported_io_types": { 00:18:16.821 "read": true, 00:18:16.821 "write": true, 00:18:16.821 "unmap": true, 00:18:16.821 "flush": false, 00:18:16.821 "reset": true, 00:18:16.821 "nvme_admin": false, 00:18:16.821 "nvme_io": false, 00:18:16.821 "nvme_io_md": false, 00:18:16.821 "write_zeroes": true, 00:18:16.821 "zcopy": false, 00:18:16.821 "get_zone_info": false, 00:18:16.821 "zone_management": false, 00:18:16.821 "zone_append": false, 00:18:16.821 "compare": false, 00:18:16.821 "compare_and_write": false, 00:18:16.821 "abort": false, 00:18:16.821 "seek_hole": true, 00:18:16.821 "seek_data": true, 00:18:16.821 "copy": false, 00:18:16.821 "nvme_iov_md": false 00:18:16.821 }, 00:18:16.821 "driver_specific": { 00:18:16.821 "lvol": { 00:18:16.821 "lvol_store_uuid": "bb5d7f31-18e8-453d-88b2-99cec449a4bc", 00:18:16.821 "base_bdev": "nvme0n1", 00:18:16.821 "thin_provision": true, 00:18:16.821 "num_allocated_clusters": 0, 00:18:16.821 "snapshot": false, 00:18:16.821 "clone": false, 00:18:16.821 "esnap_clone": false 00:18:16.821 } 00:18:16.821 } 00:18:16.821 } 00:18:16.821 ]' 00:18:16.821 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:18:16.821 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1387 -- # bs=4096 00:18:16.821 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:18:16.821 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1388 -- # nb=26476544 00:18:16.821 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:18:16.821 05:08:36 ftl.ftl_bdevperf -- common/autotest_common.sh@1392 -- # echo 103424 00:18:16.821 05:08:36 ftl.ftl_bdevperf -- ftl/bdevperf.sh@25 -- # l2p_dram_size_mb=20 00:18:16.821 05:08:36 ftl.ftl_bdevperf -- ftl/bdevperf.sh@26 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -t 240 bdev_ftl_create -b ftl0 -d 9d96c53c-3fc5-41bf-a1f2-e3632b9de32c -c nvc0n1p0 --l2p_dram_limit 20 00:18:17.083 [2024-12-15 05:08:37.116805] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.083 [2024-12-15 05:08:37.116846] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:18:17.083 [2024-12-15 05:08:37.116857] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:18:17.083 [2024-12-15 05:08:37.116864] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.083 [2024-12-15 05:08:37.116902] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.083 [2024-12-15 05:08:37.116910] mngt/ftl_mngt.c: 
428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:18:17.083 [2024-12-15 05:08:37.116920] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.027 ms 00:18:17.083 [2024-12-15 05:08:37.116927] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.083 [2024-12-15 05:08:37.116942] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:18:17.083 [2024-12-15 05:08:37.119408] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:18:17.083 [2024-12-15 05:08:37.119442] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.083 [2024-12-15 05:08:37.119453] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:18:17.083 [2024-12-15 05:08:37.119464] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.503 ms 00:18:17.083 [2024-12-15 05:08:37.119470] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.083 [2024-12-15 05:08:37.119526] mngt/ftl_mngt_md.c: 570:ftl_mngt_superblock_init: *NOTICE*: [FTL][ftl0] Create new FTL, UUID 260288ab-3f9f-4615-a46a-d3817b63dea3 00:18:17.083 [2024-12-15 05:08:37.120474] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.084 [2024-12-15 05:08:37.120500] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Default-initialize superblock 00:18:17.084 [2024-12-15 05:08:37.120508] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.020 ms 00:18:17.084 [2024-12-15 05:08:37.120517] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.084 [2024-12-15 05:08:37.125275] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.084 [2024-12-15 05:08:37.125308] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:18:17.084 [2024-12-15 05:08:37.125316] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.730 ms 00:18:17.084 [2024-12-15 05:08:37.125325] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.084 [2024-12-15 05:08:37.125380] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.084 [2024-12-15 05:08:37.125391] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:18:17.084 [2024-12-15 05:08:37.125400] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.041 ms 00:18:17.084 [2024-12-15 05:08:37.125407] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.084 [2024-12-15 05:08:37.125448] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.084 [2024-12-15 05:08:37.125457] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:18:17.084 [2024-12-15 05:08:37.125466] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.018 ms 00:18:17.084 [2024-12-15 05:08:37.125473] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.084 [2024-12-15 05:08:37.125487] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:18:17.084 [2024-12-15 05:08:37.126763] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.084 [2024-12-15 05:08:37.126792] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:18:17.084 [2024-12-15 05:08:37.126803] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.276 ms 00:18:17.084 [2024-12-15 05:08:37.126809] 
mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.084 [2024-12-15 05:08:37.126834] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.084 [2024-12-15 05:08:37.126841] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:18:17.084 [2024-12-15 05:08:37.126853] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.009 ms 00:18:17.084 [2024-12-15 05:08:37.126858] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.084 [2024-12-15 05:08:37.126876] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 1 00:18:17.084 [2024-12-15 05:08:37.126989] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:18:17.084 [2024-12-15 05:08:37.127005] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:18:17.084 [2024-12-15 05:08:37.127014] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:18:17.084 [2024-12-15 05:08:37.127022] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:18:17.084 [2024-12-15 05:08:37.127031] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:18:17.084 [2024-12-15 05:08:37.127038] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:18:17.084 [2024-12-15 05:08:37.127044] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:18:17.084 [2024-12-15 05:08:37.127051] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:18:17.084 [2024-12-15 05:08:37.127058] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:18:17.084 [2024-12-15 05:08:37.127064] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.084 [2024-12-15 05:08:37.127070] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:18:17.084 [2024-12-15 05:08:37.127077] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.189 ms 00:18:17.084 [2024-12-15 05:08:37.127082] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.084 [2024-12-15 05:08:37.127148] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.084 [2024-12-15 05:08:37.127154] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:18:17.084 [2024-12-15 05:08:37.127161] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.053 ms 00:18:17.084 [2024-12-15 05:08:37.127171] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.084 [2024-12-15 05:08:37.127243] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:18:17.084 [2024-12-15 05:08:37.127260] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:18:17.084 [2024-12-15 05:08:37.127267] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:18:17.084 [2024-12-15 05:08:37.127273] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127280] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:18:17.084 [2024-12-15 05:08:37.127285] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127292] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:18:17.084 
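As an aside, the sizes threaded through this setup can be re-derived by hand. get_bdev_size earlier turned jq's block_size/num_blocks pairs into MiB, and the L2P figures just printed line up with the 80.00 MiB l2p region dumped above:

  echo $(( 1310720  * 4096 / 1024 / 1024 ))   # -> 5120, the raw nvme0n1 size in MiB
  echo $(( 26476544 * 4096 / 1024 / 1024 ))   # -> 103424, the thin lvol size in MiB
  echo $(( 20971520 * 4    / 1024 / 1024 ))   # -> 80, L2P entries * 4 B, matching Region l2p

The --l2p_dram_limit 20 given to bdev_ftl_create only caps the resident part of that table. At one entry per 4 KiB block the entry count presumably corresponds to an exposed capacity of 80 GiB, with the remainder of the 103424 MiB lvol left to FTL's own metadata and spare bands.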
[2024-12-15 05:08:37.127297] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:18:17.084 [2024-12-15 05:08:37.127303] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127308] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:18:17.084 [2024-12-15 05:08:37.127314] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:18:17.084 [2024-12-15 05:08:37.127319] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:18:17.084 [2024-12-15 05:08:37.127328] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:18:17.084 [2024-12-15 05:08:37.127334] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:18:17.084 [2024-12-15 05:08:37.127340] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:18:17.084 [2024-12-15 05:08:37.127345] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127352] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:18:17.084 [2024-12-15 05:08:37.127356] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:18:17.084 [2024-12-15 05:08:37.127362] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127368] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:18:17.084 [2024-12-15 05:08:37.127375] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127380] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:17.084 [2024-12-15 05:08:37.127387] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:18:17.084 [2024-12-15 05:08:37.127392] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127398] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:17.084 [2024-12-15 05:08:37.127403] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:18:17.084 [2024-12-15 05:08:37.127410] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127416] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:17.084 [2024-12-15 05:08:37.127425] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:18:17.084 [2024-12-15 05:08:37.127431] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127447] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:17.084 [2024-12-15 05:08:37.127453] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:18:17.084 [2024-12-15 05:08:37.127460] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127466] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:18:17.084 [2024-12-15 05:08:37.127473] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:18:17.084 [2024-12-15 05:08:37.127479] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:18:17.084 [2024-12-15 05:08:37.127486] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:18:17.084 [2024-12-15 05:08:37.127492] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:18:17.084 [2024-12-15 05:08:37.127499] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] 
offset: 113.62 MiB 00:18:17.084 [2024-12-15 05:08:37.127505] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127511] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:18:17.084 [2024-12-15 05:08:37.127517] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:18:17.084 [2024-12-15 05:08:37.127524] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127530] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:18:17.084 [2024-12-15 05:08:37.127540] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:18:17.084 [2024-12-15 05:08:37.127547] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:18:17.084 [2024-12-15 05:08:37.127557] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:17.084 [2024-12-15 05:08:37.127563] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:18:17.084 [2024-12-15 05:08:37.127570] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:18:17.084 [2024-12-15 05:08:37.127576] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:18:17.084 [2024-12-15 05:08:37.127584] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:18:17.084 [2024-12-15 05:08:37.127589] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:18:17.084 [2024-12-15 05:08:37.127597] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:18:17.084 [2024-12-15 05:08:37.127604] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:18:17.084 [2024-12-15 05:08:37.127613] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:18:17.084 [2024-12-15 05:08:37.127620] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:18:17.084 [2024-12-15 05:08:37.127628] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:18:17.084 [2024-12-15 05:08:37.127634] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:18:17.084 [2024-12-15 05:08:37.127642] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:18:17.084 [2024-12-15 05:08:37.127648] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:18:17.084 [2024-12-15 05:08:37.127656] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:18:17.085 [2024-12-15 05:08:37.127663] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:18:17.085 [2024-12-15 05:08:37.127671] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:18:17.085 [2024-12-15 05:08:37.127677] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:18:17.085 [2024-12-15 05:08:37.127684] upgrade/ftl_sb_v5.c: 
416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:18:17.085 [2024-12-15 05:08:37.127691] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:18:17.085 [2024-12-15 05:08:37.127698] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:18:17.085 [2024-12-15 05:08:37.127705] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:18:17.085 [2024-12-15 05:08:37.127712] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:18:17.085 [2024-12-15 05:08:37.127718] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:18:17.085 [2024-12-15 05:08:37.127732] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:18:17.085 [2024-12-15 05:08:37.127739] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:18:17.085 [2024-12-15 05:08:37.127747] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:18:17.085 [2024-12-15 05:08:37.127754] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:18:17.085 [2024-12-15 05:08:37.127761] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:18:17.085 [2024-12-15 05:08:37.127768] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:17.085 [2024-12-15 05:08:37.127777] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:18:17.085 [2024-12-15 05:08:37.127784] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.577 ms 00:18:17.085 [2024-12-15 05:08:37.127791] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:17.085 [2024-12-15 05:08:37.127815] mngt/ftl_mngt_misc.c: 165:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] NV cache data region needs scrubbing, this may take a while. 
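Each Region row in the SB metadata layout above encodes its offset and size as hex counts of 4096-byte blocks, so the rows can be checked against the MiB figures from the earlier layout dump. Two spot checks under that interpretation:

  echo $(( 0x5000 * 4096 / 1024 / 1024 ))     # type 0x2 -> 80 MiB, the l2p region
  echo $(( 0x1900000 * 4096 / 1024 / 1024 ))  # type 0x9 -> 102400 MiB, the data_btm region

The offsets agree as well: blk_offs 0x20 is 0.12 MiB and blk_offs 0x40 is 0.25 MiB, matching the l2p and data_btm offsets printed earlier.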
00:18:17.085 [2024-12-15 05:08:37.127823] mngt/ftl_mngt_misc.c: 166:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] Scrubbing 5 chunks 00:18:21.296 [2024-12-15 05:08:40.812542] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.296 [2024-12-15 05:08:40.812594] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Scrub NV cache 00:18:21.296 [2024-12-15 05:08:40.812609] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3684.712 ms 00:18:21.296 [2024-12-15 05:08:40.812617] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.296 [2024-12-15 05:08:40.820021] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.820067] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:18:21.297 [2024-12-15 05:08:40.820076] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.339 ms 00:18:21.297 [2024-12-15 05:08:40.820086] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.820159] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.820173] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:18:21.297 [2024-12-15 05:08:40.820181] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.045 ms 00:18:21.297 [2024-12-15 05:08:40.820188] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.836603] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.836647] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:18:21.297 [2024-12-15 05:08:40.836664] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 16.391 ms 00:18:21.297 [2024-12-15 05:08:40.836674] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.836705] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.836719] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:18:21.297 [2024-12-15 05:08:40.836727] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.002 ms 00:18:21.297 [2024-12-15 05:08:40.836735] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.837071] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.837098] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:18:21.297 [2024-12-15 05:08:40.837107] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.290 ms 00:18:21.297 [2024-12-15 05:08:40.837119] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.837221] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.837238] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:18:21.297 [2024-12-15 05:08:40.837249] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.084 ms 00:18:21.297 [2024-12-15 05:08:40.837265] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.842125] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.842162] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:18:21.297 [2024-12-15 
05:08:40.842176] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.845 ms 00:18:21.297 [2024-12-15 05:08:40.842186] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.851241] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 19 (of 20) MiB 00:18:21.297 [2024-12-15 05:08:40.856024] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.856055] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:18:21.297 [2024-12-15 05:08:40.856066] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 13.786 ms 00:18:21.297 [2024-12-15 05:08:40.856072] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.916273] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.916358] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear L2P 00:18:21.297 [2024-12-15 05:08:40.916392] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 60.172 ms 00:18:21.297 [2024-12-15 05:08:40.916415] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.916815] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.916862] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:18:21.297 [2024-12-15 05:08:40.916885] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.307 ms 00:18:21.297 [2024-12-15 05:08:40.916910] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.922943] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.923004] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial band info metadata 00:18:21.297 [2024-12-15 05:08:40.923030] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.989 ms 00:18:21.297 [2024-12-15 05:08:40.923047] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.926854] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.926886] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial chunk info metadata 00:18:21.297 [2024-12-15 05:08:40.926897] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.739 ms 00:18:21.297 [2024-12-15 05:08:40.926904] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.927188] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.927224] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:18:21.297 [2024-12-15 05:08:40.927236] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.255 ms 00:18:21.297 [2024-12-15 05:08:40.927243] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.959980] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.960016] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Wipe P2L region 00:18:21.297 [2024-12-15 05:08:40.960041] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 32.716 ms 00:18:21.297 [2024-12-15 05:08:40.960050] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.964843] 
mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.964876] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim map 00:18:21.297 [2024-12-15 05:08:40.964888] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.748 ms 00:18:21.297 [2024-12-15 05:08:40.964895] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.968088] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.968119] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim log 00:18:21.297 [2024-12-15 05:08:40.968129] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.154 ms 00:18:21.297 [2024-12-15 05:08:40.968136] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.971528] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.971561] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:18:21.297 [2024-12-15 05:08:40.971573] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.358 ms 00:18:21.297 [2024-12-15 05:08:40.971580] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.971616] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.971633] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:18:21.297 [2024-12-15 05:08:40.971643] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:18:21.297 [2024-12-15 05:08:40.971651] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.971876] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:21.297 [2024-12-15 05:08:40.971893] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:18:21.297 [2024-12-15 05:08:40.971903] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.031 ms 00:18:21.297 [2024-12-15 05:08:40.971910] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:21.297 [2024-12-15 05:08:40.972793] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 3855.570 ms, result 0 00:18:21.297 { 00:18:21.297 "name": "ftl0", 00:18:21.297 "uuid": "260288ab-3f9f-4615-a46a-d3817b63dea3" 00:18:21.297 } 00:18:21.297 05:08:40 ftl.ftl_bdevperf -- ftl/bdevperf.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_get_stats -b ftl0 00:18:21.297 05:08:40 ftl.ftl_bdevperf -- ftl/bdevperf.sh@28 -- # jq -r .name 00:18:21.297 05:08:40 ftl.ftl_bdevperf -- ftl/bdevperf.sh@28 -- # grep -qw ftl0 00:18:21.297 05:08:41 ftl.ftl_bdevperf -- ftl/bdevperf.sh@30 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests -q 1 -w randwrite -t 4 -o 69632 00:18:21.297 [2024-12-15 05:08:41.283298] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on ftl0 00:18:21.297 I/O size of 69632 is greater than zero copy threshold (65536). 00:18:21.297 Zero copy mechanism will not be used. 00:18:21.297 Running I/O for 4 seconds... 
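Note the test reuses the already-running bdevperf app, driving each workload over RPC through the perform_tests helper rather than restarting the process; condensed, the first run above is:

  /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py \
      perform_tests -q 1 -w randwrite -t 4 -o 69632
  # 69632 B = 17 * 4096 B; being above the 65536 B threshold is why the
  # notice above says the zero-copy mechanism will not be used.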
00:18:23.184 726.00 IOPS, 48.21 MiB/s [2024-12-15T05:08:44.713Z] 688.00 IOPS, 45.69 MiB/s [2024-12-15T05:08:45.655Z] 720.67 IOPS, 47.86 MiB/s [2024-12-15T05:08:45.655Z] 710.25 IOPS, 47.17 MiB/s 00:18:25.515 Latency(us) 00:18:25.515 [2024-12-15T05:08:45.655Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:25.515 Job: ftl0 (Core Mask 0x1, workload: randwrite, depth: 1, IO size: 69632) 00:18:25.515 ftl0 : 4.00 710.18 47.16 0.00 0.00 1491.07 322.95 3402.83 00:18:25.515 [2024-12-15T05:08:45.655Z] =================================================================================================================== 00:18:25.515 [2024-12-15T05:08:45.655Z] Total : 710.18 47.16 0.00 0.00 1491.07 322.95 3402.83 00:18:25.515 [2024-12-15 05:08:45.291760] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on ftl0 00:18:25.515 { 00:18:25.515 "results": [ 00:18:25.515 { 00:18:25.515 "job": "ftl0", 00:18:25.515 "core_mask": "0x1", 00:18:25.515 "workload": "randwrite", 00:18:25.515 "status": "finished", 00:18:25.515 "queue_depth": 1, 00:18:25.515 "io_size": 69632, 00:18:25.515 "runtime": 4.001807, 00:18:25.515 "iops": 710.1791765569903, 00:18:25.515 "mibps": 47.16033594323764, 00:18:25.515 "io_failed": 0, 00:18:25.515 "io_timeout": 0, 00:18:25.515 "avg_latency_us": 1491.068833432577, 00:18:25.515 "min_latency_us": 322.95384615384614, 00:18:25.515 "max_latency_us": 3402.8307692307694 00:18:25.515 } 00:18:25.515 ], 00:18:25.515 "core_count": 1 00:18:25.515 } 00:18:25.515 05:08:45 ftl.ftl_bdevperf -- ftl/bdevperf.sh@31 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests -q 128 -w randwrite -t 4 -o 4096 00:18:25.515 [2024-12-15 05:08:45.404578] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on ftl0 00:18:25.515 Running I/O for 4 seconds... 
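The table and the results JSON above describe the same run, and the columns are mutually consistent; the MiB/s figure is simply IOPS times the 69632-byte I/O size:

  echo '710.1791765569903 * 69632 / 1048576' | bc -l   # -> ~47.16, the mibps value above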
00:18:27.402 6236.00 IOPS, 24.36 MiB/s [2024-12-15T05:08:48.488Z] 5956.50 IOPS, 23.27 MiB/s [2024-12-15T05:08:49.433Z] 5569.00 IOPS, 21.75 MiB/s [2024-12-15T05:08:49.694Z] 5373.00 IOPS, 20.99 MiB/s 00:18:29.554 Latency(us) 00:18:29.554 [2024-12-15T05:08:49.694Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:29.554 Job: ftl0 (Core Mask 0x1, workload: randwrite, depth: 128, IO size: 4096) 00:18:29.554 ftl0 : 4.03 5359.82 20.94 0.00 0.00 23786.12 340.28 64527.75 00:18:29.554 [2024-12-15T05:08:49.694Z] =================================================================================================================== 00:18:29.554 [2024-12-15T05:08:49.694Z] Total : 5359.82 20.94 0.00 0.00 23786.12 0.00 64527.75 00:18:29.554 [2024-12-15 05:08:49.445509] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on ftl0 00:18:29.554 { 00:18:29.554 "results": [ 00:18:29.554 { 00:18:29.554 "job": "ftl0", 00:18:29.554 "core_mask": "0x1", 00:18:29.554 "workload": "randwrite", 00:18:29.554 "status": "finished", 00:18:29.554 "queue_depth": 128, 00:18:29.554 "io_size": 4096, 00:18:29.554 "runtime": 4.032972, 00:18:29.554 "iops": 5359.819011885032, 00:18:29.554 "mibps": 20.936793015175905, 00:18:29.554 "io_failed": 0, 00:18:29.554 "io_timeout": 0, 00:18:29.554 "avg_latency_us": 23786.124265786028, 00:18:29.554 "min_latency_us": 340.2830769230769, 00:18:29.554 "max_latency_us": 64527.75384615385 00:18:29.554 } 00:18:29.554 ], 00:18:29.554 "core_count": 1 00:18:29.554 } 00:18:29.554 05:08:49 ftl.ftl_bdevperf -- ftl/bdevperf.sh@32 -- # /home/vagrant/spdk_repo/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests -q 128 -w verify -t 4 -o 4096 00:18:29.554 [2024-12-15 05:08:49.561637] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on ftl0 00:18:29.554 Running I/O for 4 seconds... 
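At queue depth 128 the roughly 23.8 ms average latency is close to what Little's law predicts from the measured IOPS (latency ≈ queue_depth / IOPS); the small residual gap plausibly comes from ramp-up and teardown inside the 4.03 s runtime:

  echo '128 / 5359.819011885032 * 1000000' | bc -l   # -> ~23881 us vs the 23786.12 us reported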
00:18:31.442 4411.00 IOPS, 17.23 MiB/s [2024-12-15T05:08:52.971Z] 4518.50 IOPS, 17.65 MiB/s [2024-12-15T05:08:53.915Z] 4486.33 IOPS, 17.52 MiB/s [2024-12-15T05:08:53.915Z] 4504.00 IOPS, 17.59 MiB/s 00:18:33.775 Latency(us) 00:18:33.775 [2024-12-15T05:08:53.915Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:33.775 Job: ftl0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:18:33.775 Verification LBA range: start 0x0 length 0x1400000 00:18:33.775 ftl0 : 4.02 4515.97 17.64 0.00 0.00 28253.12 393.85 42547.99 00:18:33.775 [2024-12-15T05:08:53.915Z] =================================================================================================================== 00:18:33.775 [2024-12-15T05:08:53.915Z] Total : 4515.97 17.64 0.00 0.00 28253.12 0.00 42547.99 00:18:33.775 [2024-12-15 05:08:53.586595] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on ftl0 00:18:33.775 { 00:18:33.775 "results": [ 00:18:33.775 { 00:18:33.775 "job": "ftl0", 00:18:33.775 "core_mask": "0x1", 00:18:33.775 "workload": "verify", 00:18:33.775 "status": "finished", 00:18:33.775 "verify_range": { 00:18:33.775 "start": 0, 00:18:33.775 "length": 20971520 00:18:33.775 }, 00:18:33.775 "queue_depth": 128, 00:18:33.775 "io_size": 4096, 00:18:33.775 "runtime": 4.016412, 00:18:33.775 "iops": 4515.970971105554, 00:18:33.775 "mibps": 17.64051160588107, 00:18:33.775 "io_failed": 0, 00:18:33.775 "io_timeout": 0, 00:18:33.775 "avg_latency_us": 28253.115149664536, 00:18:33.775 "min_latency_us": 393.84615384615387, 00:18:33.775 "max_latency_us": 42547.987692307695 00:18:33.775 } 00:18:33.775 ], 00:18:33.775 "core_count": 1 00:18:33.775 } 00:18:33.775 05:08:53 ftl.ftl_bdevperf -- ftl/bdevperf.sh@34 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_delete -b ftl0 00:18:33.775 [2024-12-15 05:08:53.801235] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:33.775 [2024-12-15 05:08:53.801295] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:18:33.775 [2024-12-15 05:08:53.801311] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:18:33.775 [2024-12-15 05:08:53.801321] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:33.775 [2024-12-15 05:08:53.801349] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:18:33.775 [2024-12-15 05:08:53.802077] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:33.775 [2024-12-15 05:08:53.802118] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:18:33.775 [2024-12-15 05:08:53.802129] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.714 ms 00:18:33.775 [2024-12-15 05:08:53.802146] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:33.775 [2024-12-15 05:08:53.805270] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:33.775 [2024-12-15 05:08:53.805324] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:18:33.775 [2024-12-15 05:08:53.805335] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.098 ms 00:18:33.775 [2024-12-15 05:08:53.805350] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.037 [2024-12-15 05:08:54.028257] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.037 [2024-12-15 05:08:54.028328] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: 
Persist L2P 00:18:34.037 [2024-12-15 05:08:54.028347] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 222.888 ms 00:18:34.037 [2024-12-15 05:08:54.028359] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.037 [2024-12-15 05:08:54.034660] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.037 [2024-12-15 05:08:54.034710] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:18:34.037 [2024-12-15 05:08:54.034722] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.247 ms 00:18:34.037 [2024-12-15 05:08:54.034732] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.037 [2024-12-15 05:08:54.037503] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.037 [2024-12-15 05:08:54.037563] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:18:34.037 [2024-12-15 05:08:54.037573] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.684 ms 00:18:34.037 [2024-12-15 05:08:54.037583] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.037 [2024-12-15 05:08:54.044091] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.037 [2024-12-15 05:08:54.044150] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:18:34.037 [2024-12-15 05:08:54.044162] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.462 ms 00:18:34.037 [2024-12-15 05:08:54.044176] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.037 [2024-12-15 05:08:54.044303] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.037 [2024-12-15 05:08:54.044316] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:18:34.037 [2024-12-15 05:08:54.044325] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.081 ms 00:18:34.037 [2024-12-15 05:08:54.044335] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.037 [2024-12-15 05:08:54.047393] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.037 [2024-12-15 05:08:54.047463] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:18:34.037 [2024-12-15 05:08:54.047474] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.041 ms 00:18:34.037 [2024-12-15 05:08:54.047484] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.037 [2024-12-15 05:08:54.050310] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.037 [2024-12-15 05:08:54.050372] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:18:34.037 [2024-12-15 05:08:54.050383] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.778 ms 00:18:34.037 [2024-12-15 05:08:54.050392] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.037 [2024-12-15 05:08:54.052818] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.037 [2024-12-15 05:08:54.052872] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:18:34.037 [2024-12-15 05:08:54.052883] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.380 ms 00:18:34.037 [2024-12-15 05:08:54.052898] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.037 [2024-12-15 05:08:54.055234] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.037 [2024-12-15 05:08:54.055292] 
mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:18:34.037 [2024-12-15 05:08:54.055302] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.267 ms 00:18:34.037 [2024-12-15 05:08:54.055311] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.037 [2024-12-15 05:08:54.055353] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:18:34.037 [2024-12-15 05:08:54.055375] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055387] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055398] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055406] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055416] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055424] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055453] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055462] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055471] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055480] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055495] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055502] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055512] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055520] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055529] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055537] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055546] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055553] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055563] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055571] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055581] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055589] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: 
free 00:18:34.037 [2024-12-15 05:08:54.055597] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055605] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055614] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055621] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055633] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055640] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055650] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055660] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055670] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055686] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055696] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:18:34.037 [2024-12-15 05:08:54.055703] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055713] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055721] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055731] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055739] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055749] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055758] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055767] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055784] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055796] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055805] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055815] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055823] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055832] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 
261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055840] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055849] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055856] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055866] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055873] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055882] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055890] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055899] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055906] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055915] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055922] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055934] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055942] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055951] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055968] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055979] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055987] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.055997] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056004] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056014] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056022] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056031] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056050] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056059] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056067] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: 
[FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056077] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056084] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056096] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056105] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056116] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056123] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056133] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056140] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056150] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056159] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056169] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056177] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056186] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056194] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056203] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056211] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056220] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056227] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056240] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056248] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056257] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056267] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056277] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056284] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056293] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056300] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056310] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056317] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:18:34.038 [2024-12-15 05:08:54.056336] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:18:34.038 [2024-12-15 05:08:54.056344] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 260288ab-3f9f-4615-a46a-d3817b63dea3 00:18:34.038 [2024-12-15 05:08:54.056354] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:18:34.038 [2024-12-15 05:08:54.056361] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:18:34.038 [2024-12-15 05:08:54.056370] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:18:34.038 [2024-12-15 05:08:54.056378] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:18:34.038 [2024-12-15 05:08:54.056395] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:18:34.038 [2024-12-15 05:08:54.056403] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:18:34.038 [2024-12-15 05:08:54.056412] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:18:34.038 [2024-12-15 05:08:54.056419] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:18:34.038 [2024-12-15 05:08:54.056427] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:18:34.038 [2024-12-15 05:08:54.056457] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.038 [2024-12-15 05:08:54.056484] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:18:34.038 [2024-12-15 05:08:54.056500] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.104 ms 00:18:34.038 [2024-12-15 05:08:54.056514] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.038 [2024-12-15 05:08:54.058742] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.038 [2024-12-15 05:08:54.058788] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:18:34.038 [2024-12-15 05:08:54.058799] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.206 ms 00:18:34.038 [2024-12-15 05:08:54.058809] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.038 [2024-12-15 05:08:54.058926] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:34.038 [2024-12-15 05:08:54.058941] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:18:34.038 [2024-12-15 05:08:54.058950] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.095 ms 00:18:34.039 [2024-12-15 05:08:54.058962] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.066815] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.066867] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:18:34.039 [2024-12-15 05:08:54.066878] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.066888] mngt/ftl_mngt.c: 431:trace_step: 
*NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.066951] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.066965] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:18:34.039 [2024-12-15 05:08:54.066974] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.066984] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.067054] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.067068] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:18:34.039 [2024-12-15 05:08:54.067076] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.067086] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.067100] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.067110] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:18:34.039 [2024-12-15 05:08:54.067120] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.067132] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.080352] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.080408] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:18:34.039 [2024-12-15 05:08:54.080419] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.080429] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.091014] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.091068] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:18:34.039 [2024-12-15 05:08:54.091079] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.091096] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.091158] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.091171] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:18:34.039 [2024-12-15 05:08:54.091179] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.091189] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.091229] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.091240] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:18:34.039 [2024-12-15 05:08:54.091249] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.091264] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.091337] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.091349] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:18:34.039 [2024-12-15 05:08:54.091356] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 
ms 00:18:34.039 [2024-12-15 05:08:54.091365] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.091398] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.091409] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:18:34.039 [2024-12-15 05:08:54.091421] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.091456] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.091496] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.091507] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:18:34.039 [2024-12-15 05:08:54.091515] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.091524] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.091566] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:34.039 [2024-12-15 05:08:54.091579] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:18:34.039 [2024-12-15 05:08:54.091593] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:34.039 [2024-12-15 05:08:54.091611] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:34.039 [2024-12-15 05:08:54.091744] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 290.469 ms, result 0 00:18:34.039 true 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- ftl/bdevperf.sh@36 -- # killprocess 89029 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@954 -- # '[' -z 89029 ']' 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@958 -- # kill -0 89029 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@959 -- # uname 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 89029 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:18:34.039 killing process with pid 89029 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@972 -- # echo 'killing process with pid 89029' 00:18:34.039 Received shutdown signal, test time was about 4.000000 seconds 00:18:34.039 00:18:34.039 Latency(us) 00:18:34.039 [2024-12-15T05:08:54.179Z] Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:18:34.039 [2024-12-15T05:08:54.179Z] =================================================================================================================== 00:18:34.039 [2024-12-15T05:08:54.179Z] Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@973 -- # kill 89029 00:18:34.039 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@978 -- # wait 89029 00:18:34.612 05:08:54 ftl.ftl_bdevperf -- ftl/bdevperf.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:18:34.612 05:08:54 ftl.ftl_bdevperf -- ftl/bdevperf.sh@39 -- # remove_shm 00:18:34.612 Remove shared memory files 00:18:34.612 05:08:54 ftl.ftl_bdevperf -- ftl/common.sh@204 -- # echo Remove shared memory files 00:18:34.612 05:08:54 
ftl.ftl_bdevperf -- ftl/common.sh@205 -- # rm -f rm -f 00:18:34.612 05:08:54 ftl.ftl_bdevperf -- ftl/common.sh@206 -- # rm -f rm -f 00:18:34.612 05:08:54 ftl.ftl_bdevperf -- ftl/common.sh@207 -- # rm -f rm -f 00:18:34.612 05:08:54 ftl.ftl_bdevperf -- ftl/common.sh@208 -- # rm -f rm -f /dev/shm/iscsi 00:18:34.612 05:08:54 ftl.ftl_bdevperf -- ftl/common.sh@209 -- # rm -f rm -f 00:18:34.612 ************************************ 00:18:34.612 END TEST ftl_bdevperf 00:18:34.612 ************************************ 00:18:34.612 00:18:34.612 real 0m21.464s 00:18:34.612 user 0m24.106s 00:18:34.612 sys 0m0.987s 00:18:34.612 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@1130 -- # xtrace_disable 00:18:34.612 05:08:54 ftl.ftl_bdevperf -- common/autotest_common.sh@10 -- # set +x 00:18:34.612 05:08:54 ftl -- ftl/ftl.sh@75 -- # run_test ftl_trim /home/vagrant/spdk_repo/spdk/test/ftl/trim.sh 0000:00:11.0 0000:00:10.0 00:18:34.612 05:08:54 ftl -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:18:34.612 05:08:54 ftl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:18:34.612 05:08:54 ftl -- common/autotest_common.sh@10 -- # set +x 00:18:34.612 ************************************ 00:18:34.612 START TEST ftl_trim 00:18:34.612 ************************************ 00:18:34.612 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/ftl/trim.sh 0000:00:11.0 0000:00:10.0 00:18:34.612 * Looking for test storage... 00:18:34.612 * Found test storage at /home/vagrant/spdk_repo/spdk/test/ftl 00:18:34.612 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:18:34.612 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@1711 -- # lcov --version 00:18:34.612 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:18:34.612 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@333 -- # local ver1 ver1_l 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@334 -- # local ver2 ver2_l 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@336 -- # IFS=.-: 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@336 -- # read -ra ver1 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@337 -- # IFS=.-: 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@337 -- # read -ra ver2 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@338 -- # local 'op=<' 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@340 -- # ver1_l=2 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@341 -- # ver2_l=1 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@344 -- # case "$op" in 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@345 -- # : 1 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@364 -- # (( v = 0 )) 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@365 -- # decimal 1 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@353 -- # local d=1 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@355 -- # echo 1 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@365 -- # ver1[v]=1 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@366 -- # decimal 2 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@353 -- # local d=2 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@355 -- # echo 2 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@366 -- # ver2[v]=2 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:18:34.612 05:08:54 ftl.ftl_trim -- scripts/common.sh@368 -- # return 0 00:18:34.612 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:18:34.612 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:18:34.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:34.612 --rc genhtml_branch_coverage=1 00:18:34.612 --rc genhtml_function_coverage=1 00:18:34.612 --rc genhtml_legend=1 00:18:34.612 --rc geninfo_all_blocks=1 00:18:34.612 --rc geninfo_unexecuted_blocks=1 00:18:34.612 00:18:34.612 ' 00:18:34.612 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:18:34.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:34.612 --rc genhtml_branch_coverage=1 00:18:34.612 --rc genhtml_function_coverage=1 00:18:34.612 --rc genhtml_legend=1 00:18:34.612 --rc geninfo_all_blocks=1 00:18:34.612 --rc geninfo_unexecuted_blocks=1 00:18:34.612 00:18:34.612 ' 00:18:34.612 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:18:34.612 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:34.612 --rc genhtml_branch_coverage=1 00:18:34.612 --rc genhtml_function_coverage=1 00:18:34.612 --rc genhtml_legend=1 00:18:34.612 --rc geninfo_all_blocks=1 00:18:34.613 --rc geninfo_unexecuted_blocks=1 00:18:34.613 00:18:34.613 ' 00:18:34.613 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:18:34.613 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:18:34.613 --rc genhtml_branch_coverage=1 00:18:34.613 --rc genhtml_function_coverage=1 00:18:34.613 --rc genhtml_legend=1 00:18:34.613 --rc geninfo_all_blocks=1 00:18:34.613 --rc geninfo_unexecuted_blocks=1 00:18:34.613 00:18:34.613 ' 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@10 -- # source /home/vagrant/spdk_repo/spdk/test/ftl/common.sh 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/ftl/trim.sh 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@8 -- # testdir=/home/vagrant/spdk_repo/spdk/test/ftl 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@9 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl/../.. 
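The xtrace above is scripts/common.sh's lcov version gate: cmp_versions splits each dotted version on '.', '-' and ':' into arrays (ver1, ver2) and compares them field by field until one differs. A standalone sketch of the same comparison (the function name lt and the echo are illustrative, not from this log):

    lt() {
        # split e.g. "1.15" and "2" into numeric fields on . - :
        local IFS=.-: i
        local -a a b
        read -ra a <<< "$1"
        read -ra b <<< "$2"
        for (( i = 0; i < ${#a[@]} || i < ${#b[@]}; i++ )); do
            (( ${a[i]:-0} < ${b[i]:-0} )) && return 0   # first differing field decides
            (( ${a[i]:-0} > ${b[i]:-0} )) && return 1
        done
        return 1   # equal versions are not "less than"
    }
    lt 1.15 2 && echo "lcov predates 2.x"

Missing trailing fields default to 0, so 1.15 compares against 2.0 and loses on the first field.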
00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@9 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@10 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@12 -- # export 'ftl_tgt_core_mask=[0]' 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@12 -- # ftl_tgt_core_mask='[0]' 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@14 -- # export spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@14 -- # spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@15 -- # export 'spdk_tgt_cpumask=[0]' 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@15 -- # spdk_tgt_cpumask='[0]' 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@16 -- # export spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@16 -- # spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@17 -- # export spdk_tgt_pid= 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@17 -- # spdk_tgt_pid= 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@19 -- # export spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@19 -- # spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@20 -- # export 'spdk_ini_cpumask=[1]' 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@20 -- # spdk_ini_cpumask='[1]' 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@21 -- # export spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@21 -- # spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@22 -- # export spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@22 -- # spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@23 -- # export spdk_ini_pid= 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@23 -- # spdk_ini_pid= 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@25 -- # export spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/common.sh@25 -- # spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@12 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@23 -- # device=0000:00:11.0 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@24 -- # cache_device=0000:00:10.0 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@25 -- # timeout=240 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@26 -- # data_size_in_blocks=65536 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@27 -- # unmap_size_in_blocks=1024 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@29 -- # [[ y != y ]] 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@34 -- # export FTL_BDEV_NAME=ftl0 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@34 -- # FTL_BDEV_NAME=ftl0 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@35 -- # export FTL_JSON_CONF=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@35 -- # FTL_JSON_CONF=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:18:34.613 05:08:54 ftl.ftl_trim -- 
ftl/trim.sh@37 -- # trap 'fio_kill; exit 1' SIGINT SIGTERM EXIT 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@40 -- # svcpid=89370 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@41 -- # waitforlisten 89370 00:18:34.613 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@835 -- # '[' -z 89370 ']' 00:18:34.613 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:18:34.613 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@840 -- # local max_retries=100 00:18:34.613 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:18:34.613 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:18:34.613 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@844 -- # xtrace_disable 00:18:34.613 05:08:54 ftl.ftl_trim -- common/autotest_common.sh@10 -- # set +x 00:18:34.613 05:08:54 ftl.ftl_trim -- ftl/trim.sh@39 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x7 00:18:34.875 [2024-12-15 05:08:54.806121] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:18:34.875 [2024-12-15 05:08:54.806270] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid89370 ] 00:18:34.875 [2024-12-15 05:08:54.969519] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 3 00:18:34.875 [2024-12-15 05:08:55.001162] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:18:34.875 [2024-12-15 05:08:55.001485] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:18:34.875 [2024-12-15 05:08:55.001573] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 2 00:18:35.821 05:08:55 ftl.ftl_trim -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:18:35.821 05:08:55 ftl.ftl_trim -- common/autotest_common.sh@868 -- # return 0 00:18:35.821 05:08:55 ftl.ftl_trim -- ftl/trim.sh@43 -- # create_base_bdev nvme0 0000:00:11.0 103424 00:18:35.821 05:08:55 ftl.ftl_trim -- ftl/common.sh@54 -- # local name=nvme0 00:18:35.821 05:08:55 ftl.ftl_trim -- ftl/common.sh@55 -- # local base_bdf=0000:00:11.0 00:18:35.821 05:08:55 ftl.ftl_trim -- ftl/common.sh@56 -- # local size=103424 00:18:35.821 05:08:55 ftl.ftl_trim -- ftl/common.sh@59 -- # local base_bdev 00:18:35.821 05:08:55 ftl.ftl_trim -- ftl/common.sh@60 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t PCIe -a 0000:00:11.0 00:18:36.084 05:08:55 ftl.ftl_trim -- ftl/common.sh@60 -- # base_bdev=nvme0n1 00:18:36.084 05:08:55 ftl.ftl_trim -- ftl/common.sh@62 -- # local base_size 00:18:36.084 05:08:55 ftl.ftl_trim -- ftl/common.sh@63 -- # get_bdev_size nvme0n1 00:18:36.084 05:08:55 ftl.ftl_trim -- common/autotest_common.sh@1382 -- # local bdev_name=nvme0n1 00:18:36.084 05:08:55 ftl.ftl_trim -- common/autotest_common.sh@1383 -- # local bdev_info 00:18:36.084 05:08:55 ftl.ftl_trim -- common/autotest_common.sh@1384 -- # local bs 00:18:36.084 05:08:55 ftl.ftl_trim -- common/autotest_common.sh@1385 -- # local nb 00:18:36.084 05:08:55 ftl.ftl_trim -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b nvme0n1 00:18:36.084 05:08:56 ftl.ftl_trim -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:18:36.084 { 00:18:36.084 "name": "nvme0n1", 00:18:36.084 "aliases": [ 
00:18:36.084 "5952b19c-2d67-48cc-a6a8-1dccdec8b319" 00:18:36.084 ], 00:18:36.084 "product_name": "NVMe disk", 00:18:36.084 "block_size": 4096, 00:18:36.084 "num_blocks": 1310720, 00:18:36.084 "uuid": "5952b19c-2d67-48cc-a6a8-1dccdec8b319", 00:18:36.084 "numa_id": -1, 00:18:36.084 "assigned_rate_limits": { 00:18:36.084 "rw_ios_per_sec": 0, 00:18:36.084 "rw_mbytes_per_sec": 0, 00:18:36.084 "r_mbytes_per_sec": 0, 00:18:36.084 "w_mbytes_per_sec": 0 00:18:36.084 }, 00:18:36.084 "claimed": true, 00:18:36.084 "claim_type": "read_many_write_one", 00:18:36.084 "zoned": false, 00:18:36.084 "supported_io_types": { 00:18:36.084 "read": true, 00:18:36.084 "write": true, 00:18:36.084 "unmap": true, 00:18:36.084 "flush": true, 00:18:36.084 "reset": true, 00:18:36.084 "nvme_admin": true, 00:18:36.084 "nvme_io": true, 00:18:36.084 "nvme_io_md": false, 00:18:36.084 "write_zeroes": true, 00:18:36.084 "zcopy": false, 00:18:36.084 "get_zone_info": false, 00:18:36.084 "zone_management": false, 00:18:36.084 "zone_append": false, 00:18:36.084 "compare": true, 00:18:36.084 "compare_and_write": false, 00:18:36.084 "abort": true, 00:18:36.084 "seek_hole": false, 00:18:36.084 "seek_data": false, 00:18:36.084 "copy": true, 00:18:36.084 "nvme_iov_md": false 00:18:36.084 }, 00:18:36.084 "driver_specific": { 00:18:36.084 "nvme": [ 00:18:36.084 { 00:18:36.084 "pci_address": "0000:00:11.0", 00:18:36.084 "trid": { 00:18:36.084 "trtype": "PCIe", 00:18:36.084 "traddr": "0000:00:11.0" 00:18:36.084 }, 00:18:36.084 "ctrlr_data": { 00:18:36.084 "cntlid": 0, 00:18:36.084 "vendor_id": "0x1b36", 00:18:36.084 "model_number": "QEMU NVMe Ctrl", 00:18:36.084 "serial_number": "12341", 00:18:36.084 "firmware_revision": "8.0.0", 00:18:36.084 "subnqn": "nqn.2019-08.org.qemu:12341", 00:18:36.084 "oacs": { 00:18:36.084 "security": 0, 00:18:36.084 "format": 1, 00:18:36.084 "firmware": 0, 00:18:36.084 "ns_manage": 1 00:18:36.084 }, 00:18:36.084 "multi_ctrlr": false, 00:18:36.084 "ana_reporting": false 00:18:36.084 }, 00:18:36.084 "vs": { 00:18:36.084 "nvme_version": "1.4" 00:18:36.084 }, 00:18:36.084 "ns_data": { 00:18:36.084 "id": 1, 00:18:36.084 "can_share": false 00:18:36.084 } 00:18:36.084 } 00:18:36.084 ], 00:18:36.084 "mp_policy": "active_passive" 00:18:36.084 } 00:18:36.084 } 00:18:36.084 ]' 00:18:36.084 05:08:56 ftl.ftl_trim -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:18:36.346 05:08:56 ftl.ftl_trim -- common/autotest_common.sh@1387 -- # bs=4096 00:18:36.346 05:08:56 ftl.ftl_trim -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:18:36.346 05:08:56 ftl.ftl_trim -- common/autotest_common.sh@1388 -- # nb=1310720 00:18:36.346 05:08:56 ftl.ftl_trim -- common/autotest_common.sh@1391 -- # bdev_size=5120 00:18:36.346 05:08:56 ftl.ftl_trim -- common/autotest_common.sh@1392 -- # echo 5120 00:18:36.346 05:08:56 ftl.ftl_trim -- ftl/common.sh@63 -- # base_size=5120 00:18:36.346 05:08:56 ftl.ftl_trim -- ftl/common.sh@64 -- # [[ 103424 -le 5120 ]] 00:18:36.346 05:08:56 ftl.ftl_trim -- ftl/common.sh@67 -- # clear_lvols 00:18:36.346 05:08:56 ftl.ftl_trim -- ftl/common.sh@28 -- # jq -r '.[] | .uuid' 00:18:36.346 05:08:56 ftl.ftl_trim -- ftl/common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_get_lvstores 00:18:36.608 05:08:56 ftl.ftl_trim -- ftl/common.sh@28 -- # stores=bb5d7f31-18e8-453d-88b2-99cec449a4bc 00:18:36.608 05:08:56 ftl.ftl_trim -- ftl/common.sh@29 -- # for lvs in $stores 00:18:36.608 05:08:56 ftl.ftl_trim -- ftl/common.sh@30 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py 
bdev_lvol_delete_lvstore -u bb5d7f31-18e8-453d-88b2-99cec449a4bc 00:18:36.608 05:08:56 ftl.ftl_trim -- ftl/common.sh@68 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create_lvstore nvme0n1 lvs 00:18:36.870 05:08:56 ftl.ftl_trim -- ftl/common.sh@68 -- # lvs=43f5c3f1-af03-43d0-b7e9-91da31196343 00:18:36.870 05:08:56 ftl.ftl_trim -- ftl/common.sh@69 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create nvme0n1p0 103424 -t -u 43f5c3f1-af03-43d0-b7e9-91da31196343 00:18:37.131 05:08:57 ftl.ftl_trim -- ftl/trim.sh@43 -- # split_bdev=9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:37.131 05:08:57 ftl.ftl_trim -- ftl/trim.sh@44 -- # create_nv_cache_bdev nvc0 0000:00:10.0 9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:37.131 05:08:57 ftl.ftl_trim -- ftl/common.sh@35 -- # local name=nvc0 00:18:37.131 05:08:57 ftl.ftl_trim -- ftl/common.sh@36 -- # local cache_bdf=0000:00:10.0 00:18:37.131 05:08:57 ftl.ftl_trim -- ftl/common.sh@37 -- # local base_bdev=9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:37.132 05:08:57 ftl.ftl_trim -- ftl/common.sh@38 -- # local cache_size= 00:18:37.132 05:08:57 ftl.ftl_trim -- ftl/common.sh@41 -- # get_bdev_size 9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:37.132 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1382 -- # local bdev_name=9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:37.132 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1383 -- # local bdev_info 00:18:37.132 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1384 -- # local bs 00:18:37.132 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1385 -- # local nb 00:18:37.132 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:37.393 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:18:37.393 { 00:18:37.393 "name": "9243ddab-52d2-4e52-82bb-1c6fcb1ef45c", 00:18:37.393 "aliases": [ 00:18:37.393 "lvs/nvme0n1p0" 00:18:37.393 ], 00:18:37.393 "product_name": "Logical Volume", 00:18:37.393 "block_size": 4096, 00:18:37.393 "num_blocks": 26476544, 00:18:37.393 "uuid": "9243ddab-52d2-4e52-82bb-1c6fcb1ef45c", 00:18:37.393 "assigned_rate_limits": { 00:18:37.393 "rw_ios_per_sec": 0, 00:18:37.393 "rw_mbytes_per_sec": 0, 00:18:37.393 "r_mbytes_per_sec": 0, 00:18:37.393 "w_mbytes_per_sec": 0 00:18:37.393 }, 00:18:37.393 "claimed": false, 00:18:37.393 "zoned": false, 00:18:37.393 "supported_io_types": { 00:18:37.393 "read": true, 00:18:37.393 "write": true, 00:18:37.393 "unmap": true, 00:18:37.393 "flush": false, 00:18:37.393 "reset": true, 00:18:37.393 "nvme_admin": false, 00:18:37.393 "nvme_io": false, 00:18:37.393 "nvme_io_md": false, 00:18:37.393 "write_zeroes": true, 00:18:37.393 "zcopy": false, 00:18:37.393 "get_zone_info": false, 00:18:37.393 "zone_management": false, 00:18:37.393 "zone_append": false, 00:18:37.393 "compare": false, 00:18:37.393 "compare_and_write": false, 00:18:37.393 "abort": false, 00:18:37.393 "seek_hole": true, 00:18:37.393 "seek_data": true, 00:18:37.393 "copy": false, 00:18:37.393 "nvme_iov_md": false 00:18:37.393 }, 00:18:37.393 "driver_specific": { 00:18:37.393 "lvol": { 00:18:37.393 "lvol_store_uuid": "43f5c3f1-af03-43d0-b7e9-91da31196343", 00:18:37.393 "base_bdev": "nvme0n1", 00:18:37.393 "thin_provision": true, 00:18:37.393 "num_allocated_clusters": 0, 00:18:37.393 "snapshot": false, 00:18:37.393 "clone": false, 00:18:37.393 "esnap_clone": false 00:18:37.393 } 00:18:37.393 } 00:18:37.393 } 00:18:37.393 ]' 00:18:37.393 05:08:57 ftl.ftl_trim -- 
common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:18:37.393 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1387 -- # bs=4096 00:18:37.393 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:18:37.393 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1388 -- # nb=26476544 00:18:37.393 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:18:37.393 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1392 -- # echo 103424 00:18:37.393 05:08:57 ftl.ftl_trim -- ftl/common.sh@41 -- # local base_size=5171 00:18:37.393 05:08:57 ftl.ftl_trim -- ftl/common.sh@44 -- # local nvc_bdev 00:18:37.393 05:08:57 ftl.ftl_trim -- ftl/common.sh@45 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvc0 -t PCIe -a 0000:00:10.0 00:18:37.653 05:08:57 ftl.ftl_trim -- ftl/common.sh@45 -- # nvc_bdev=nvc0n1 00:18:37.653 05:08:57 ftl.ftl_trim -- ftl/common.sh@47 -- # [[ -z '' ]] 00:18:37.653 05:08:57 ftl.ftl_trim -- ftl/common.sh@48 -- # get_bdev_size 9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:37.653 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1382 -- # local bdev_name=9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:37.653 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1383 -- # local bdev_info 00:18:37.653 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1384 -- # local bs 00:18:37.653 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1385 -- # local nb 00:18:37.653 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:37.911 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:18:37.911 { 00:18:37.911 "name": "9243ddab-52d2-4e52-82bb-1c6fcb1ef45c", 00:18:37.912 "aliases": [ 00:18:37.912 "lvs/nvme0n1p0" 00:18:37.912 ], 00:18:37.912 "product_name": "Logical Volume", 00:18:37.912 "block_size": 4096, 00:18:37.912 "num_blocks": 26476544, 00:18:37.912 "uuid": "9243ddab-52d2-4e52-82bb-1c6fcb1ef45c", 00:18:37.912 "assigned_rate_limits": { 00:18:37.912 "rw_ios_per_sec": 0, 00:18:37.912 "rw_mbytes_per_sec": 0, 00:18:37.912 "r_mbytes_per_sec": 0, 00:18:37.912 "w_mbytes_per_sec": 0 00:18:37.912 }, 00:18:37.912 "claimed": false, 00:18:37.912 "zoned": false, 00:18:37.912 "supported_io_types": { 00:18:37.912 "read": true, 00:18:37.912 "write": true, 00:18:37.912 "unmap": true, 00:18:37.912 "flush": false, 00:18:37.912 "reset": true, 00:18:37.912 "nvme_admin": false, 00:18:37.912 "nvme_io": false, 00:18:37.912 "nvme_io_md": false, 00:18:37.912 "write_zeroes": true, 00:18:37.912 "zcopy": false, 00:18:37.912 "get_zone_info": false, 00:18:37.912 "zone_management": false, 00:18:37.912 "zone_append": false, 00:18:37.912 "compare": false, 00:18:37.912 "compare_and_write": false, 00:18:37.912 "abort": false, 00:18:37.912 "seek_hole": true, 00:18:37.912 "seek_data": true, 00:18:37.912 "copy": false, 00:18:37.912 "nvme_iov_md": false 00:18:37.912 }, 00:18:37.912 "driver_specific": { 00:18:37.912 "lvol": { 00:18:37.912 "lvol_store_uuid": "43f5c3f1-af03-43d0-b7e9-91da31196343", 00:18:37.912 "base_bdev": "nvme0n1", 00:18:37.912 "thin_provision": true, 00:18:37.912 "num_allocated_clusters": 0, 00:18:37.912 "snapshot": false, 00:18:37.912 "clone": false, 00:18:37.912 "esnap_clone": false 00:18:37.912 } 00:18:37.912 } 00:18:37.912 } 00:18:37.912 ]' 00:18:37.912 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:18:37.912 05:08:57 ftl.ftl_trim -- 
common/autotest_common.sh@1387 -- # bs=4096 00:18:37.912 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:18:37.912 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1388 -- # nb=26476544 00:18:37.912 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:18:37.912 05:08:57 ftl.ftl_trim -- common/autotest_common.sh@1392 -- # echo 103424 00:18:37.912 05:08:57 ftl.ftl_trim -- ftl/common.sh@48 -- # cache_size=5171 00:18:37.912 05:08:57 ftl.ftl_trim -- ftl/common.sh@50 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_split_create nvc0n1 -s 5171 1 00:18:38.170 05:08:58 ftl.ftl_trim -- ftl/trim.sh@44 -- # nv_cache=nvc0n1p0 00:18:38.170 05:08:58 ftl.ftl_trim -- ftl/trim.sh@46 -- # l2p_percentage=60 00:18:38.170 05:08:58 ftl.ftl_trim -- ftl/trim.sh@47 -- # get_bdev_size 9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:38.170 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1382 -- # local bdev_name=9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:38.170 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1383 -- # local bdev_info 00:18:38.170 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1384 -- # local bs 00:18:38.170 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1385 -- # local nb 00:18:38.170 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 9243ddab-52d2-4e52-82bb-1c6fcb1ef45c 00:18:38.428 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:18:38.428 { 00:18:38.428 "name": "9243ddab-52d2-4e52-82bb-1c6fcb1ef45c", 00:18:38.428 "aliases": [ 00:18:38.428 "lvs/nvme0n1p0" 00:18:38.428 ], 00:18:38.428 "product_name": "Logical Volume", 00:18:38.428 "block_size": 4096, 00:18:38.428 "num_blocks": 26476544, 00:18:38.428 "uuid": "9243ddab-52d2-4e52-82bb-1c6fcb1ef45c", 00:18:38.428 "assigned_rate_limits": { 00:18:38.428 "rw_ios_per_sec": 0, 00:18:38.428 "rw_mbytes_per_sec": 0, 00:18:38.428 "r_mbytes_per_sec": 0, 00:18:38.428 "w_mbytes_per_sec": 0 00:18:38.428 }, 00:18:38.428 "claimed": false, 00:18:38.428 "zoned": false, 00:18:38.428 "supported_io_types": { 00:18:38.428 "read": true, 00:18:38.428 "write": true, 00:18:38.428 "unmap": true, 00:18:38.428 "flush": false, 00:18:38.428 "reset": true, 00:18:38.428 "nvme_admin": false, 00:18:38.428 "nvme_io": false, 00:18:38.428 "nvme_io_md": false, 00:18:38.428 "write_zeroes": true, 00:18:38.428 "zcopy": false, 00:18:38.428 "get_zone_info": false, 00:18:38.428 "zone_management": false, 00:18:38.428 "zone_append": false, 00:18:38.428 "compare": false, 00:18:38.428 "compare_and_write": false, 00:18:38.428 "abort": false, 00:18:38.428 "seek_hole": true, 00:18:38.428 "seek_data": true, 00:18:38.428 "copy": false, 00:18:38.428 "nvme_iov_md": false 00:18:38.428 }, 00:18:38.428 "driver_specific": { 00:18:38.428 "lvol": { 00:18:38.428 "lvol_store_uuid": "43f5c3f1-af03-43d0-b7e9-91da31196343", 00:18:38.428 "base_bdev": "nvme0n1", 00:18:38.428 "thin_provision": true, 00:18:38.428 "num_allocated_clusters": 0, 00:18:38.428 "snapshot": false, 00:18:38.428 "clone": false, 00:18:38.428 "esnap_clone": false 00:18:38.428 } 00:18:38.428 } 00:18:38.428 } 00:18:38.428 ]' 00:18:38.428 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:18:38.428 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1387 -- # bs=4096 00:18:38.428 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:18:38.428 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1388 -- # 
nb=26476544 00:18:38.428 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:18:38.428 05:08:58 ftl.ftl_trim -- common/autotest_common.sh@1392 -- # echo 103424 00:18:38.428 05:08:58 ftl.ftl_trim -- ftl/trim.sh@47 -- # l2p_dram_size_mb=60 00:18:38.428 05:08:58 ftl.ftl_trim -- ftl/trim.sh@49 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -t 240 bdev_ftl_create -b ftl0 -d 9243ddab-52d2-4e52-82bb-1c6fcb1ef45c -c nvc0n1p0 --core_mask 7 --l2p_dram_limit 60 --overprovisioning 10 00:18:38.687 [2024-12-15 05:08:58.634815] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.687 [2024-12-15 05:08:58.634860] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:18:38.687 [2024-12-15 05:08:58.634872] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:18:38.687 [2024-12-15 05:08:58.634885] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.687 [2024-12-15 05:08:58.637299] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.687 [2024-12-15 05:08:58.637333] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:18:38.687 [2024-12-15 05:08:58.637343] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.383 ms 00:18:38.687 [2024-12-15 05:08:58.637353] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.687 [2024-12-15 05:08:58.637472] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:18:38.687 [2024-12-15 05:08:58.637708] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:18:38.687 [2024-12-15 05:08:58.637723] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.687 [2024-12-15 05:08:58.637732] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:18:38.687 [2024-12-15 05:08:58.637751] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.258 ms 00:18:38.687 [2024-12-15 05:08:58.637775] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.687 [2024-12-15 05:08:58.637867] mngt/ftl_mngt_md.c: 570:ftl_mngt_superblock_init: *NOTICE*: [FTL][ftl0] Create new FTL, UUID 3af9b670-83ba-4af2-8311-25d469da50d8 00:18:38.687 [2024-12-15 05:08:58.639050] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.687 [2024-12-15 05:08:58.639082] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Default-initialize superblock 00:18:38.687 [2024-12-15 05:08:58.639094] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.026 ms 00:18:38.687 [2024-12-15 05:08:58.639102] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.687 [2024-12-15 05:08:58.644638] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.687 [2024-12-15 05:08:58.644769] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:18:38.687 [2024-12-15 05:08:58.644786] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.454 ms 00:18:38.687 [2024-12-15 05:08:58.644793] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.687 [2024-12-15 05:08:58.644900] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.687 [2024-12-15 05:08:58.644910] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:18:38.687 [2024-12-15 05:08:58.644920] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] 
duration: 0.057 ms 00:18:38.687 [2024-12-15 05:08:58.644929] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.687 [2024-12-15 05:08:58.644966] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.688 [2024-12-15 05:08:58.644974] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:18:38.688 [2024-12-15 05:08:58.644984] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.008 ms 00:18:38.688 [2024-12-15 05:08:58.644990] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.688 [2024-12-15 05:08:58.645024] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on ftl_core_thread 00:18:38.688 [2024-12-15 05:08:58.646481] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.688 [2024-12-15 05:08:58.646503] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:18:38.688 [2024-12-15 05:08:58.646514] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.464 ms 00:18:38.688 [2024-12-15 05:08:58.646523] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.688 [2024-12-15 05:08:58.646576] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.688 [2024-12-15 05:08:58.646599] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:18:38.688 [2024-12-15 05:08:58.646607] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.012 ms 00:18:38.688 [2024-12-15 05:08:58.646618] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.688 [2024-12-15 05:08:58.646647] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 1 00:18:38.688 [2024-12-15 05:08:58.646786] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:18:38.688 [2024-12-15 05:08:58.646797] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:18:38.688 [2024-12-15 05:08:58.646811] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:18:38.688 [2024-12-15 05:08:58.646820] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:18:38.688 [2024-12-15 05:08:58.646831] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:18:38.688 [2024-12-15 05:08:58.646838] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 23592960 00:18:38.688 [2024-12-15 05:08:58.646847] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:18:38.688 [2024-12-15 05:08:58.646854] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:18:38.688 [2024-12-15 05:08:58.646863] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:18:38.688 [2024-12-15 05:08:58.646872] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.688 [2024-12-15 05:08:58.646881] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:18:38.688 [2024-12-15 05:08:58.646888] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.226 ms 00:18:38.688 [2024-12-15 05:08:58.646897] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.688 [2024-12-15 05:08:58.646995] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.688 
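
A quick consistency check on the geometry just printed, sketched in shell arithmetic (values taken from this log; the 4096-byte block size comes from the bdev info dumped further below): 23592960 L2P entries at the stated 4-byte address size account exactly for the 90.00 MiB l2p region in the layout dump that follows, and the same 23592960 blocks at 4 KiB are the 90 GiB of user capacity the bdev later reports as num_blocks.

# Hypothetical shell check, not part of trim.sh:
echo $(( 23592960 * 4 / 1048576 ))       # L2P table size in MiB -> 90
echo $(( 23592960 * 4096 / 1073741824 )) # user capacity in GiB  -> 90
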
[2024-12-15 05:08:58.647010] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:18:38.688 [2024-12-15 05:08:58.647018] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:18:38.688 [2024-12-15 05:08:58.647026] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.688 [2024-12-15 05:08:58.647148] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:18:38.688 [2024-12-15 05:08:58.647161] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:18:38.688 [2024-12-15 05:08:58.647170] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:18:38.688 [2024-12-15 05:08:58.647179] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647199] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:18:38.688 [2024-12-15 05:08:58.647210] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647218] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 90.00 MiB 00:18:38.688 [2024-12-15 05:08:58.647227] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:18:38.688 [2024-12-15 05:08:58.647235] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647245] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:18:38.688 [2024-12-15 05:08:58.647253] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:18:38.688 [2024-12-15 05:08:58.647263] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.62 MiB 00:18:38.688 [2024-12-15 05:08:58.647270] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:18:38.688 [2024-12-15 05:08:58.647281] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:18:38.688 [2024-12-15 05:08:58.647289] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.88 MiB 00:18:38.688 [2024-12-15 05:08:58.647298] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647305] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:18:38.688 [2024-12-15 05:08:58.647314] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 124.00 MiB 00:18:38.688 [2024-12-15 05:08:58.647321] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647332] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:18:38.688 [2024-12-15 05:08:58.647340] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 91.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647349] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:38.688 [2024-12-15 05:08:58.647357] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:18:38.688 [2024-12-15 05:08:58.647366] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 99.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647373] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:38.688 [2024-12-15 05:08:58.647382] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:18:38.688 [2024-12-15 05:08:58.647390] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 107.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647400] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:38.688 [2024-12-15 05:08:58.647407] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] 
Region p2l3 00:18:38.688 [2024-12-15 05:08:58.647418] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 115.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647425] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:38.688 [2024-12-15 05:08:58.647452] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:18:38.688 [2024-12-15 05:08:58.647460] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647470] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:18:38.688 [2024-12-15 05:08:58.647478] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:18:38.688 [2024-12-15 05:08:58.647488] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.38 MiB 00:18:38.688 [2024-12-15 05:08:58.647495] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:18:38.688 [2024-12-15 05:08:58.647505] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:18:38.688 [2024-12-15 05:08:58.647513] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.62 MiB 00:18:38.688 [2024-12-15 05:08:58.647522] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647529] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:18:38.688 [2024-12-15 05:08:58.647539] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.75 MiB 00:18:38.688 [2024-12-15 05:08:58.647546] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647555] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:18:38.688 [2024-12-15 05:08:58.647564] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:18:38.688 [2024-12-15 05:08:58.647575] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:18:38.688 [2024-12-15 05:08:58.647584] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:38.688 [2024-12-15 05:08:58.647603] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:18:38.688 [2024-12-15 05:08:58.647609] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:18:38.688 [2024-12-15 05:08:58.647618] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:18:38.688 [2024-12-15 05:08:58.647624] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:18:38.688 [2024-12-15 05:08:58.647632] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:18:38.688 [2024-12-15 05:08:58.647640] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:18:38.688 [2024-12-15 05:08:58.647649] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:18:38.688 [2024-12-15 05:08:58.647658] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:18:38.688 [2024-12-15 05:08:58.647668] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5a00 00:18:38.688 [2024-12-15 05:08:58.647675] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5a20 blk_sz:0x80 00:18:38.688 [2024-12-15 05:08:58.647685] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x5aa0 
blk_sz:0x80 00:18:38.689 [2024-12-15 05:08:58.647694] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5b20 blk_sz:0x800 00:18:38.689 [2024-12-15 05:08:58.647702] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x6320 blk_sz:0x800 00:18:38.689 [2024-12-15 05:08:58.647709] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6b20 blk_sz:0x800 00:18:38.689 [2024-12-15 05:08:58.647720] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x7320 blk_sz:0x800 00:18:38.689 [2024-12-15 05:08:58.647726] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7b20 blk_sz:0x40 00:18:38.689 [2024-12-15 05:08:58.647735] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7b60 blk_sz:0x40 00:18:38.689 [2024-12-15 05:08:58.647742] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x7ba0 blk_sz:0x20 00:18:38.689 [2024-12-15 05:08:58.647750] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x7bc0 blk_sz:0x20 00:18:38.689 [2024-12-15 05:08:58.647757] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x7be0 blk_sz:0x20 00:18:38.689 [2024-12-15 05:08:58.647766] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7c00 blk_sz:0x20 00:18:38.689 [2024-12-15 05:08:58.647773] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7c20 blk_sz:0x13b6e0 00:18:38.689 [2024-12-15 05:08:58.647781] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:18:38.689 [2024-12-15 05:08:58.647791] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:18:38.689 [2024-12-15 05:08:58.647801] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:18:38.689 [2024-12-15 05:08:58.647808] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:18:38.689 [2024-12-15 05:08:58.647816] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:18:38.689 [2024-12-15 05:08:58.647823] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:18:38.689 [2024-12-15 05:08:58.647833] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:38.689 [2024-12-15 05:08:58.647840] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:18:38.689 [2024-12-15 05:08:58.647850] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.749 ms 00:18:38.689 [2024-12-15 05:08:58.647857] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:38.689 [2024-12-15 05:08:58.647962] mngt/ftl_mngt_misc.c: 165:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] NV cache data region 
needs scrubbing, this may take a while. 00:18:38.689 [2024-12-15 05:08:58.647982] mngt/ftl_mngt_misc.c: 166:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] Scrubbing 5 chunks 00:18:42.052 [2024-12-15 05:09:01.361020] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.361211] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Scrub NV cache 00:18:42.052 [2024-12-15 05:09:01.361238] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2713.044 ms 00:18:42.052 [2024-12-15 05:09:01.361248] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.369571] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.369604] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:18:42.052 [2024-12-15 05:09:01.369618] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.227 ms 00:18:42.052 [2024-12-15 05:09:01.369626] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.369763] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.369774] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:18:42.052 [2024-12-15 05:09:01.369784] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.064 ms 00:18:42.052 [2024-12-15 05:09:01.369808] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.386144] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.386184] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:18:42.052 [2024-12-15 05:09:01.386198] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 16.301 ms 00:18:42.052 [2024-12-15 05:09:01.386206] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.386291] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.386302] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:18:42.052 [2024-12-15 05:09:01.386317] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:18:42.052 [2024-12-15 05:09:01.386324] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.386674] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.386689] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:18:42.052 [2024-12-15 05:09:01.386700] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.312 ms 00:18:42.052 [2024-12-15 05:09:01.386708] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.386847] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.386857] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:18:42.052 [2024-12-15 05:09:01.386868] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.100 ms 00:18:42.052 [2024-12-15 05:09:01.386889] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.392608] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.392752] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize 
reloc 00:18:42.052 [2024-12-15 05:09:01.392774] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.680 ms 00:18:42.052 [2024-12-15 05:09:01.392782] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.402003] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 59 (of 60) MiB 00:18:42.052 [2024-12-15 05:09:01.416053] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.416178] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:18:42.052 [2024-12-15 05:09:01.416193] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 23.160 ms 00:18:42.052 [2024-12-15 05:09:01.416203] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.493959] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.494067] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear L2P 00:18:42.052 [2024-12-15 05:09:01.494100] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 77.685 ms 00:18:42.052 [2024-12-15 05:09:01.494131] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.494663] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.494715] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:18:42.052 [2024-12-15 05:09:01.494742] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.422 ms 00:18:42.052 [2024-12-15 05:09:01.494770] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.499250] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.499286] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial band info metadata 00:18:42.052 [2024-12-15 05:09:01.499296] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.405 ms 00:18:42.052 [2024-12-15 05:09:01.499306] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.502067] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.502199] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial chunk info metadata 00:18:42.052 [2024-12-15 05:09:01.502215] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.730 ms 00:18:42.052 [2024-12-15 05:09:01.502224] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.052 [2024-12-15 05:09:01.502550] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.052 [2024-12-15 05:09:01.502564] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:18:42.052 [2024-12-15 05:09:01.502585] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.272 ms 00:18:42.052 [2024-12-15 05:09:01.502596] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.053 [2024-12-15 05:09:01.533827] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.053 [2024-12-15 05:09:01.533862] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Wipe P2L region 00:18:42.053 [2024-12-15 05:09:01.533872] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 31.184 ms 00:18:42.053 [2024-12-15 05:09:01.533884] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 
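
Everything from "Check configuration" down to this point is the startup trace produced by the bdev_ftl_create RPC issued at ftl/trim.sh@49; note how the l2p cache notice above ("59 (of 60) MiB") reflects the --l2p_dram_limit 60 passed there. A minimal standalone sketch of the same create-and-verify flow, assuming a running SPDK target that already exposes the base bdev and the nvc0n1p0 cache partition (RPC path, UUID, and flags copied from this run):

# Sketch only, not the autotest script itself.
RPC=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
BASE_UUID=9243ddab-52d2-4e52-82bb-1c6fcb1ef45c
# -t 240 widens the RPC timeout because startup may scrub the NV cache;
# --core_mask 7 runs FTL on cores 0-2, --l2p_dram_limit 60 caps the
# DRAM-resident L2P at 60 MiB, --overprovisioning 10 keeps 10% of bands
# spare for relocation.
"$RPC" -t 240 bdev_ftl_create -b ftl0 -d "$BASE_UUID" -c nvc0n1p0 \
    --core_mask 7 --l2p_dram_limit 60 --overprovisioning 10
# Verify the bdev came up and read back its size, as waitforbdev/jq do below.
"$RPC" bdev_get_bdevs -b ftl0 -t 2000 | jq '.[] .num_blocks'
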
00:18:42.053 [2024-12-15 05:09:01.538031] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.053 [2024-12-15 05:09:01.538065] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim map 00:18:42.053 [2024-12-15 05:09:01.538075] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.066 ms 00:18:42.053 [2024-12-15 05:09:01.538085] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.053 [2024-12-15 05:09:01.541858] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.053 [2024-12-15 05:09:01.541889] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim log 00:18:42.053 [2024-12-15 05:09:01.541897] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.708 ms 00:18:42.053 [2024-12-15 05:09:01.541906] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.053 [2024-12-15 05:09:01.546020] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.053 [2024-12-15 05:09:01.546055] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:18:42.053 [2024-12-15 05:09:01.546064] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.070 ms 00:18:42.053 [2024-12-15 05:09:01.546074] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.053 [2024-12-15 05:09:01.546128] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.053 [2024-12-15 05:09:01.546139] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:18:42.053 [2024-12-15 05:09:01.546147] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:18:42.053 [2024-12-15 05:09:01.546156] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.053 [2024-12-15 05:09:01.546236] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.053 [2024-12-15 05:09:01.546246] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:18:42.053 [2024-12-15 05:09:01.546254] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.031 ms 00:18:42.053 [2024-12-15 05:09:01.546263] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.053 [2024-12-15 05:09:01.547067] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:18:42.053 [2024-12-15 05:09:01.548036] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 2911.989 ms, result 0 00:18:42.053 [2024-12-15 05:09:01.549095] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:18:42.053 { 00:18:42.053 "name": "ftl0", 00:18:42.053 "uuid": "3af9b670-83ba-4af2-8311-25d469da50d8" 00:18:42.053 } 00:18:42.053 05:09:01 ftl.ftl_trim -- ftl/trim.sh@51 -- # waitforbdev ftl0 00:18:42.053 05:09:01 ftl.ftl_trim -- common/autotest_common.sh@903 -- # local bdev_name=ftl0 00:18:42.053 05:09:01 ftl.ftl_trim -- common/autotest_common.sh@904 -- # local bdev_timeout= 00:18:42.053 05:09:01 ftl.ftl_trim -- common/autotest_common.sh@905 -- # local i 00:18:42.053 05:09:01 ftl.ftl_trim -- common/autotest_common.sh@906 -- # [[ -z '' ]] 00:18:42.053 05:09:01 ftl.ftl_trim -- common/autotest_common.sh@906 -- # bdev_timeout=2000 00:18:42.053 05:09:01 ftl.ftl_trim -- common/autotest_common.sh@908 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_wait_for_examine 00:18:42.053 05:09:01 ftl.ftl_trim -- 
common/autotest_common.sh@910 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b ftl0 -t 2000 00:18:42.053 [ 00:18:42.053 { 00:18:42.053 "name": "ftl0", 00:18:42.053 "aliases": [ 00:18:42.053 "3af9b670-83ba-4af2-8311-25d469da50d8" 00:18:42.053 ], 00:18:42.053 "product_name": "FTL disk", 00:18:42.053 "block_size": 4096, 00:18:42.053 "num_blocks": 23592960, 00:18:42.053 "uuid": "3af9b670-83ba-4af2-8311-25d469da50d8", 00:18:42.053 "assigned_rate_limits": { 00:18:42.053 "rw_ios_per_sec": 0, 00:18:42.053 "rw_mbytes_per_sec": 0, 00:18:42.053 "r_mbytes_per_sec": 0, 00:18:42.053 "w_mbytes_per_sec": 0 00:18:42.053 }, 00:18:42.053 "claimed": false, 00:18:42.053 "zoned": false, 00:18:42.053 "supported_io_types": { 00:18:42.053 "read": true, 00:18:42.053 "write": true, 00:18:42.053 "unmap": true, 00:18:42.053 "flush": true, 00:18:42.053 "reset": false, 00:18:42.053 "nvme_admin": false, 00:18:42.053 "nvme_io": false, 00:18:42.053 "nvme_io_md": false, 00:18:42.053 "write_zeroes": true, 00:18:42.053 "zcopy": false, 00:18:42.053 "get_zone_info": false, 00:18:42.053 "zone_management": false, 00:18:42.053 "zone_append": false, 00:18:42.053 "compare": false, 00:18:42.053 "compare_and_write": false, 00:18:42.053 "abort": false, 00:18:42.053 "seek_hole": false, 00:18:42.053 "seek_data": false, 00:18:42.053 "copy": false, 00:18:42.053 "nvme_iov_md": false 00:18:42.053 }, 00:18:42.053 "driver_specific": { 00:18:42.053 "ftl": { 00:18:42.053 "base_bdev": "9243ddab-52d2-4e52-82bb-1c6fcb1ef45c", 00:18:42.053 "cache": "nvc0n1p0" 00:18:42.053 } 00:18:42.053 } 00:18:42.053 } 00:18:42.053 ] 00:18:42.053 05:09:01 ftl.ftl_trim -- common/autotest_common.sh@911 -- # return 0 00:18:42.053 05:09:01 ftl.ftl_trim -- ftl/trim.sh@54 -- # echo '{"subsystems": [' 00:18:42.053 05:09:01 ftl.ftl_trim -- ftl/trim.sh@55 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py save_subsystem_config -n bdev 00:18:42.313 05:09:02 ftl.ftl_trim -- ftl/trim.sh@56 -- # echo ']}' 00:18:42.313 05:09:02 ftl.ftl_trim -- ftl/trim.sh@59 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b ftl0 00:18:42.313 05:09:02 ftl.ftl_trim -- ftl/trim.sh@59 -- # bdev_info='[ 00:18:42.313 { 00:18:42.313 "name": "ftl0", 00:18:42.313 "aliases": [ 00:18:42.313 "3af9b670-83ba-4af2-8311-25d469da50d8" 00:18:42.313 ], 00:18:42.313 "product_name": "FTL disk", 00:18:42.313 "block_size": 4096, 00:18:42.313 "num_blocks": 23592960, 00:18:42.313 "uuid": "3af9b670-83ba-4af2-8311-25d469da50d8", 00:18:42.313 "assigned_rate_limits": { 00:18:42.313 "rw_ios_per_sec": 0, 00:18:42.313 "rw_mbytes_per_sec": 0, 00:18:42.313 "r_mbytes_per_sec": 0, 00:18:42.313 "w_mbytes_per_sec": 0 00:18:42.313 }, 00:18:42.313 "claimed": false, 00:18:42.313 "zoned": false, 00:18:42.313 "supported_io_types": { 00:18:42.313 "read": true, 00:18:42.313 "write": true, 00:18:42.313 "unmap": true, 00:18:42.313 "flush": true, 00:18:42.313 "reset": false, 00:18:42.313 "nvme_admin": false, 00:18:42.313 "nvme_io": false, 00:18:42.313 "nvme_io_md": false, 00:18:42.313 "write_zeroes": true, 00:18:42.313 "zcopy": false, 00:18:42.313 "get_zone_info": false, 00:18:42.313 "zone_management": false, 00:18:42.313 "zone_append": false, 00:18:42.313 "compare": false, 00:18:42.313 "compare_and_write": false, 00:18:42.313 "abort": false, 00:18:42.313 "seek_hole": false, 00:18:42.313 "seek_data": false, 00:18:42.313 "copy": false, 00:18:42.313 "nvme_iov_md": false 00:18:42.313 }, 00:18:42.313 "driver_specific": { 00:18:42.313 "ftl": { 00:18:42.313 "base_bdev": "9243ddab-52d2-4e52-82bb-1c6fcb1ef45c", 
00:18:42.313 "cache": "nvc0n1p0" 00:18:42.313 } 00:18:42.313 } 00:18:42.313 } 00:18:42.313 ]' 00:18:42.313 05:09:02 ftl.ftl_trim -- ftl/trim.sh@60 -- # jq '.[] .num_blocks' 00:18:42.313 05:09:02 ftl.ftl_trim -- ftl/trim.sh@60 -- # nb=23592960 00:18:42.313 05:09:02 ftl.ftl_trim -- ftl/trim.sh@61 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_unload -b ftl0 00:18:42.573 [2024-12-15 05:09:02.592832] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.573 [2024-12-15 05:09:02.592883] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:18:42.573 [2024-12-15 05:09:02.592898] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:18:42.573 [2024-12-15 05:09:02.592907] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.573 [2024-12-15 05:09:02.592962] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on ftl_core_thread 00:18:42.573 [2024-12-15 05:09:02.593409] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.573 [2024-12-15 05:09:02.593426] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:18:42.573 [2024-12-15 05:09:02.593454] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.434 ms 00:18:42.573 [2024-12-15 05:09:02.593465] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.573 [2024-12-15 05:09:02.594138] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.573 [2024-12-15 05:09:02.594154] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:18:42.573 [2024-12-15 05:09:02.594163] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.632 ms 00:18:42.573 [2024-12-15 05:09:02.594171] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.573 [2024-12-15 05:09:02.597832] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.573 [2024-12-15 05:09:02.597854] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:18:42.573 [2024-12-15 05:09:02.597864] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.628 ms 00:18:42.573 [2024-12-15 05:09:02.597873] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.573 [2024-12-15 05:09:02.604793] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.573 [2024-12-15 05:09:02.604828] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:18:42.573 [2024-12-15 05:09:02.604836] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.870 ms 00:18:42.573 [2024-12-15 05:09:02.604848] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.573 [2024-12-15 05:09:02.606954] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.573 [2024-12-15 05:09:02.606987] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:18:42.573 [2024-12-15 05:09:02.606997] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.004 ms 00:18:42.573 [2024-12-15 05:09:02.607005] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.574 [2024-12-15 05:09:02.611680] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.574 [2024-12-15 05:09:02.611794] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:18:42.574 [2024-12-15 05:09:02.611810] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] 
duration: 4.624 ms 00:18:42.574 [2024-12-15 05:09:02.611820] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.574 [2024-12-15 05:09:02.612037] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.574 [2024-12-15 05:09:02.612063] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:18:42.574 [2024-12-15 05:09:02.612071] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.162 ms 00:18:42.574 [2024-12-15 05:09:02.612091] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.574 [2024-12-15 05:09:02.613908] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.574 [2024-12-15 05:09:02.613941] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:18:42.574 [2024-12-15 05:09:02.613950] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.784 ms 00:18:42.574 [2024-12-15 05:09:02.613961] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.574 [2024-12-15 05:09:02.615336] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.574 [2024-12-15 05:09:02.615445] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:18:42.574 [2024-12-15 05:09:02.615459] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.330 ms 00:18:42.574 [2024-12-15 05:09:02.615468] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.574 [2024-12-15 05:09:02.616638] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.574 [2024-12-15 05:09:02.616676] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:18:42.574 [2024-12-15 05:09:02.616685] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.127 ms 00:18:42.574 [2024-12-15 05:09:02.616694] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.574 [2024-12-15 05:09:02.617878] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.574 [2024-12-15 05:09:02.617912] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:18:42.574 [2024-12-15 05:09:02.617920] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.072 ms 00:18:42.574 [2024-12-15 05:09:02.617928] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.574 [2024-12-15 05:09:02.617971] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:18:42.574 [2024-12-15 05:09:02.617986] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.617995] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618006] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618014] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618023] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618030] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618039] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618047] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618056] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618064] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618073] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618080] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618088] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618096] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618104] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618112] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618121] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618128] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618140] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618148] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618156] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618164] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618186] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618194] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618203] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618210] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618220] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618227] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618236] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618244] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618252] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618260] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 
05:09:02.618269] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618276] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618287] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618295] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618305] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618313] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618322] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618329] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618338] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618345] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618355] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618362] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618371] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618378] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618387] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618395] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618403] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618410] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618421] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618428] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618452] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618460] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618468] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618476] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618485] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 
00:18:42.574 [2024-12-15 05:09:02.618506] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618515] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618523] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618532] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:18:42.574 [2024-12-15 05:09:02.618539] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618548] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618555] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618564] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618572] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618583] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618590] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618600] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618608] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618617] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618625] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618634] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618641] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618649] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618657] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618666] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618673] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618682] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618689] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618698] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618705] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 
wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618716] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618723] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618732] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618739] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618748] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618755] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618763] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618770] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618779] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618786] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618795] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618803] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618812] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618818] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618828] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618835] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618846] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618853] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:18:42.575 [2024-12-15 05:09:02.618870] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:18:42.575 [2024-12-15 05:09:02.618881] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 3af9b670-83ba-4af2-8311-25d469da50d8 00:18:42.575 [2024-12-15 05:09:02.618890] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:18:42.575 [2024-12-15 05:09:02.618897] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:18:42.575 [2024-12-15 05:09:02.618907] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:18:42.575 [2024-12-15 05:09:02.618914] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:18:42.575 [2024-12-15 05:09:02.618923] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:18:42.575 [2024-12-15 05:09:02.618930] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:18:42.575 
[2024-12-15 05:09:02.618938] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:18:42.575 [2024-12-15 05:09:02.618944] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:18:42.575 [2024-12-15 05:09:02.618952] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:18:42.575 [2024-12-15 05:09:02.618959] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.575 [2024-12-15 05:09:02.618968] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:18:42.575 [2024-12-15 05:09:02.618976] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.989 ms 00:18:42.575 [2024-12-15 05:09:02.618986] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.575 [2024-12-15 05:09:02.620492] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.575 [2024-12-15 05:09:02.620510] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:18:42.575 [2024-12-15 05:09:02.620519] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.474 ms 00:18:42.575 [2024-12-15 05:09:02.620528] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.575 [2024-12-15 05:09:02.620621] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:42.575 [2024-12-15 05:09:02.620631] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:18:42.575 [2024-12-15 05:09:02.620639] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.057 ms 00:18:42.575 [2024-12-15 05:09:02.620647] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.575 [2024-12-15 05:09:02.626005] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.575 [2024-12-15 05:09:02.626109] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:18:42.575 [2024-12-15 05:09:02.626164] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.575 [2024-12-15 05:09:02.626189] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.575 [2024-12-15 05:09:02.626405] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.575 [2024-12-15 05:09:02.626463] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:18:42.575 [2024-12-15 05:09:02.626488] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.575 [2024-12-15 05:09:02.626588] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.575 [2024-12-15 05:09:02.626667] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.575 [2024-12-15 05:09:02.626695] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:18:42.575 [2024-12-15 05:09:02.626715] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.575 [2024-12-15 05:09:02.626734] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.575 [2024-12-15 05:09:02.626781] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.575 [2024-12-15 05:09:02.626858] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:18:42.575 [2024-12-15 05:09:02.626882] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.575 [2024-12-15 05:09:02.626901] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.575 [2024-12-15 05:09:02.635897] mngt/ftl_mngt.c: 427:trace_step: 
*NOTICE*: [FTL][ftl0] Rollback 00:18:42.575 [2024-12-15 05:09:02.636033] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:18:42.575 [2024-12-15 05:09:02.636123] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.575 [2024-12-15 05:09:02.636147] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.575 [2024-12-15 05:09:02.643721] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.575 [2024-12-15 05:09:02.643836] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:18:42.575 [2024-12-15 05:09:02.643883] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.575 [2024-12-15 05:09:02.643909] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.575 [2024-12-15 05:09:02.644016] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.575 [2024-12-15 05:09:02.644049] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:18:42.575 [2024-12-15 05:09:02.644072] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.575 [2024-12-15 05:09:02.644155] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.575 [2024-12-15 05:09:02.644233] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.575 [2024-12-15 05:09:02.644255] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:18:42.575 [2024-12-15 05:09:02.644274] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.575 [2024-12-15 05:09:02.644321] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.576 [2024-12-15 05:09:02.644429] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.576 [2024-12-15 05:09:02.644468] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:18:42.576 [2024-12-15 05:09:02.644566] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.576 [2024-12-15 05:09:02.644605] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.576 [2024-12-15 05:09:02.644683] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.576 [2024-12-15 05:09:02.644836] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:18:42.576 [2024-12-15 05:09:02.644882] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.576 [2024-12-15 05:09:02.644907] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.576 [2024-12-15 05:09:02.644978] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.576 [2024-12-15 05:09:02.645002] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:18:42.576 [2024-12-15 05:09:02.645021] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.576 [2024-12-15 05:09:02.645068] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.576 [2024-12-15 05:09:02.645137] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:18:42.576 [2024-12-15 05:09:02.645216] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:18:42.576 [2024-12-15 05:09:02.645239] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:18:42.576 [2024-12-15 05:09:02.645258] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:42.576 
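
The Rollback records above are bdev_ftl_unload (issued at ftl/trim.sh@61) unwinding each startup step in reverse; once the RPC returns true, the harness kills the target. A minimal teardown sketch under the same assumptions as the create sketch earlier ($RPC as defined there; APP_PID is a stand-in for the spdk target pid, 89370 in this run):

# Sketch only: unload persists metadata and prints the statistics dump above.
"$RPC" bdev_ftl_unload -b ftl0
# Stop the SPDK app the way killprocess does at trim.sh@63: signal, then reap.
APP_PID=89370
kill "$APP_PID"
wait "$APP_PID"
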
[2024-12-15 05:09:02.645494] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 52.629 ms, result 0 00:18:42.576 true 00:18:42.576 05:09:02 ftl.ftl_trim -- ftl/trim.sh@63 -- # killprocess 89370 00:18:42.576 05:09:02 ftl.ftl_trim -- common/autotest_common.sh@954 -- # '[' -z 89370 ']' 00:18:42.576 05:09:02 ftl.ftl_trim -- common/autotest_common.sh@958 -- # kill -0 89370 00:18:42.576 05:09:02 ftl.ftl_trim -- common/autotest_common.sh@959 -- # uname 00:18:42.576 05:09:02 ftl.ftl_trim -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:18:42.576 05:09:02 ftl.ftl_trim -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 89370 00:18:42.576 05:09:02 ftl.ftl_trim -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:18:42.576 05:09:02 ftl.ftl_trim -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:18:42.576 05:09:02 ftl.ftl_trim -- common/autotest_common.sh@972 -- # echo 'killing process with pid 89370' 00:18:42.576 killing process with pid 89370 00:18:42.576 05:09:02 ftl.ftl_trim -- common/autotest_common.sh@973 -- # kill 89370 00:18:42.576 05:09:02 ftl.ftl_trim -- common/autotest_common.sh@978 -- # wait 89370 00:18:47.848 05:09:07 ftl.ftl_trim -- ftl/trim.sh@66 -- # dd if=/dev/urandom bs=4K count=65536 00:18:48.109 65536+0 records in 00:18:48.109 65536+0 records out 00:18:48.109 268435456 bytes (268 MB, 256 MiB) copied, 0.801858 s, 335 MB/s 00:18:48.109 05:09:08 ftl.ftl_trim -- ftl/trim.sh@69 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/ftl/random_pattern --ob=ftl0 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:18:48.370 [2024-12-15 05:09:08.267109] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
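
After the target is killed, trim.sh@66 generates the write payload and trim.sh@69 replays it through ftl0 from a standalone spdk_dd process; the "Starting SPDK" banner just above is spdk_dd bringing up its own copy of the bdev stack from the saved JSON. Note the arithmetic in the dd output: 65536 blocks of 4 KiB is 268435456 bytes, exactly the 256 MiB reported. A sketch of the same two steps (paths copied from this run; the of= target is an assumption inferred from the --if path used next, since the xtrace does not show the redirection):

# 65536 * 4096 B = 256 MiB of random payload; of= path is assumed here.
dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/ftl/random_pattern bs=4K count=65536
# Write it through the FTL bdev, rebuilding the bdev stack from the JSON
# saved earlier by "save_subsystem_config -n bdev".
/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd \
    --if=/home/vagrant/spdk_repo/spdk/test/ftl/random_pattern \
    --ob=ftl0 \
    --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json
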
00:18:48.370 [2024-12-15 05:09:08.267231] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid89530 ] 00:18:48.370 [2024-12-15 05:09:08.425482] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:48.370 [2024-12-15 05:09:08.454355] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:18:48.633 [2024-12-15 05:09:08.570996] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:18:48.633 [2024-12-15 05:09:08.571087] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:18:48.633 [2024-12-15 05:09:08.732095] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.732151] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:18:48.633 [2024-12-15 05:09:08.732166] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:18:48.633 [2024-12-15 05:09:08.732174] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.734760] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.734809] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:18:48.633 [2024-12-15 05:09:08.734820] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.565 ms 00:18:48.633 [2024-12-15 05:09:08.734829] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.734926] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:18:48.633 [2024-12-15 05:09:08.735196] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:18:48.633 [2024-12-15 05:09:08.735215] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.735224] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:18:48.633 [2024-12-15 05:09:08.735237] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.297 ms 00:18:48.633 [2024-12-15 05:09:08.735246] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.737373] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:18:48.633 [2024-12-15 05:09:08.740819] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.740871] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:18:48.633 [2024-12-15 05:09:08.740888] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.450 ms 00:18:48.633 [2024-12-15 05:09:08.740897] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.740982] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.740993] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:18:48.633 [2024-12-15 05:09:08.741002] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.027 ms 00:18:48.633 [2024-12-15 05:09:08.741010] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.748971] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
00:18:48.633 [2024-12-15 05:09:08.749017] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:18:48.633 [2024-12-15 05:09:08.749031] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.914 ms 00:18:48.633 [2024-12-15 05:09:08.749039] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.749180] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.749193] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:18:48.633 [2024-12-15 05:09:08.749202] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.074 ms 00:18:48.633 [2024-12-15 05:09:08.749213] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.749238] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.749249] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:18:48.633 [2024-12-15 05:09:08.749258] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.007 ms 00:18:48.633 [2024-12-15 05:09:08.749265] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.749289] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on ftl_core_thread 00:18:48.633 [2024-12-15 05:09:08.751323] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.751366] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:18:48.633 [2024-12-15 05:09:08.751376] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.039 ms 00:18:48.633 [2024-12-15 05:09:08.751389] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.751454] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.751468] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:18:48.633 [2024-12-15 05:09:08.751482] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.035 ms 00:18:48.633 [2024-12-15 05:09:08.751494] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.751512] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:18:48.633 [2024-12-15 05:09:08.751535] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:18:48.633 [2024-12-15 05:09:08.751575] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:18:48.633 [2024-12-15 05:09:08.751594] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:18:48.633 [2024-12-15 05:09:08.751703] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:18:48.633 [2024-12-15 05:09:08.751714] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:18:48.633 [2024-12-15 05:09:08.751725] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:18:48.633 [2024-12-15 05:09:08.751736] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:18:48.633 [2024-12-15 05:09:08.751746] ftl_layout.c: 
687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:18:48.633 [2024-12-15 05:09:08.751755] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 23592960 00:18:48.633 [2024-12-15 05:09:08.751763] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:18:48.633 [2024-12-15 05:09:08.751771] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:18:48.633 [2024-12-15 05:09:08.751785] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:18:48.633 [2024-12-15 05:09:08.751796] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.751807] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:18:48.633 [2024-12-15 05:09:08.751815] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.284 ms 00:18:48.633 [2024-12-15 05:09:08.751823] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.751912] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.633 [2024-12-15 05:09:08.751921] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:18:48.633 [2024-12-15 05:09:08.751932] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:18:48.633 [2024-12-15 05:09:08.751940] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.633 [2024-12-15 05:09:08.752040] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:18:48.633 [2024-12-15 05:09:08.752078] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:18:48.633 [2024-12-15 05:09:08.752091] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:18:48.633 [2024-12-15 05:09:08.752100] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:48.633 [2024-12-15 05:09:08.752112] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:18:48.633 [2024-12-15 05:09:08.752120] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:18:48.633 [2024-12-15 05:09:08.752129] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 90.00 MiB 00:18:48.633 [2024-12-15 05:09:08.752140] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:18:48.633 [2024-12-15 05:09:08.752149] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752157] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:18:48.634 [2024-12-15 05:09:08.752166] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:18:48.634 [2024-12-15 05:09:08.752174] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.62 MiB 00:18:48.634 [2024-12-15 05:09:08.752181] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:18:48.634 [2024-12-15 05:09:08.752189] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:18:48.634 [2024-12-15 05:09:08.752198] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.88 MiB 00:18:48.634 [2024-12-15 05:09:08.752205] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752213] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:18:48.634 [2024-12-15 05:09:08.752221] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 124.00 MiB 00:18:48.634 [2024-12-15 05:09:08.752229] ftl_layout.c: 
133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752238] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:18:48.634 [2024-12-15 05:09:08.752246] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 91.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752254] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:48.634 [2024-12-15 05:09:08.752264] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:18:48.634 [2024-12-15 05:09:08.752278] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 99.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752286] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:48.634 [2024-12-15 05:09:08.752293] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:18:48.634 [2024-12-15 05:09:08.752301] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 107.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752309] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:48.634 [2024-12-15 05:09:08.752317] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:18:48.634 [2024-12-15 05:09:08.752325] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 115.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752333] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:18:48.634 [2024-12-15 05:09:08.752341] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:18:48.634 [2024-12-15 05:09:08.752348] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752354] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:18:48.634 [2024-12-15 05:09:08.752361] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:18:48.634 [2024-12-15 05:09:08.752367] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.38 MiB 00:18:48.634 [2024-12-15 05:09:08.752374] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:18:48.634 [2024-12-15 05:09:08.752381] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:18:48.634 [2024-12-15 05:09:08.752388] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.62 MiB 00:18:48.634 [2024-12-15 05:09:08.752397] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752403] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:18:48.634 [2024-12-15 05:09:08.752410] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.75 MiB 00:18:48.634 [2024-12-15 05:09:08.752417] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752424] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:18:48.634 [2024-12-15 05:09:08.752703] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:18:48.634 [2024-12-15 05:09:08.752747] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:18:48.634 [2024-12-15 05:09:08.752777] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:18:48.634 [2024-12-15 05:09:08.752800] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:18:48.634 [2024-12-15 05:09:08.752819] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:18:48.634 [2024-12-15 05:09:08.752837] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:18:48.634 
[2024-12-15 05:09:08.752856] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:18:48.634 [2024-12-15 05:09:08.752874] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:18:48.634 [2024-12-15 05:09:08.752893] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:18:48.634 [2024-12-15 05:09:08.752913] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:18:48.634 [2024-12-15 05:09:08.752947] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:18:48.634 [2024-12-15 05:09:08.752980] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5a00 00:18:48.634 [2024-12-15 05:09:08.753077] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5a20 blk_sz:0x80 00:18:48.634 [2024-12-15 05:09:08.753106] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x5aa0 blk_sz:0x80 00:18:48.634 [2024-12-15 05:09:08.753135] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5b20 blk_sz:0x800 00:18:48.634 [2024-12-15 05:09:08.753692] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x6320 blk_sz:0x800 00:18:48.634 [2024-12-15 05:09:08.753760] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6b20 blk_sz:0x800 00:18:48.634 [2024-12-15 05:09:08.753792] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x7320 blk_sz:0x800 00:18:48.634 [2024-12-15 05:09:08.753821] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7b20 blk_sz:0x40 00:18:48.634 [2024-12-15 05:09:08.753899] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7b60 blk_sz:0x40 00:18:48.634 [2024-12-15 05:09:08.753909] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x7ba0 blk_sz:0x20 00:18:48.634 [2024-12-15 05:09:08.753917] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x7bc0 blk_sz:0x20 00:18:48.634 [2024-12-15 05:09:08.753925] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x7be0 blk_sz:0x20 00:18:48.634 [2024-12-15 05:09:08.753933] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7c00 blk_sz:0x20 00:18:48.634 [2024-12-15 05:09:08.753940] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7c20 blk_sz:0x13b6e0 00:18:48.634 [2024-12-15 05:09:08.753948] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:18:48.634 [2024-12-15 05:09:08.753965] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:18:48.634 [2024-12-15 05:09:08.753977] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe 
ver:0 blk_offs:0x20 blk_sz:0x20 00:18:48.634 [2024-12-15 05:09:08.753984] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:18:48.634 [2024-12-15 05:09:08.753992] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:18:48.634 [2024-12-15 05:09:08.754000] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:18:48.634 [2024-12-15 05:09:08.754011] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.634 [2024-12-15 05:09:08.754021] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:18:48.634 [2024-12-15 05:09:08.754031] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.039 ms 00:18:48.634 [2024-12-15 05:09:08.754039] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.634 [2024-12-15 05:09:08.767865] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.634 [2024-12-15 05:09:08.767915] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:18:48.634 [2024-12-15 05:09:08.767928] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 13.726 ms 00:18:48.634 [2024-12-15 05:09:08.767945] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.634 [2024-12-15 05:09:08.768098] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.634 [2024-12-15 05:09:08.768115] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:18:48.634 [2024-12-15 05:09:08.768128] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.082 ms 00:18:48.634 [2024-12-15 05:09:08.768136] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.793818] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.793892] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:18:48.897 [2024-12-15 05:09:08.793915] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 25.656 ms 00:18:48.897 [2024-12-15 05:09:08.793930] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.794088] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.794111] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:18:48.897 [2024-12-15 05:09:08.794129] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms 00:18:48.897 [2024-12-15 05:09:08.794143] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.794772] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.794818] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:18:48.897 [2024-12-15 05:09:08.794851] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.589 ms 00:18:48.897 [2024-12-15 05:09:08.794868] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.795119] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.795154] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:18:48.897 [2024-12-15 05:09:08.795177] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.203 ms 00:18:48.897 [2024-12-15 05:09:08.795192] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.803775] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.803818] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:18:48.897 [2024-12-15 05:09:08.803828] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.544 ms 00:18:48.897 [2024-12-15 05:09:08.803842] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.807707] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 0, empty chunks = 4 00:18:48.897 [2024-12-15 05:09:08.807758] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:18:48.897 [2024-12-15 05:09:08.807771] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.807779] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:18:48.897 [2024-12-15 05:09:08.807788] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.826 ms 00:18:48.897 [2024-12-15 05:09:08.807797] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.823519] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.823567] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:18:48.897 [2024-12-15 05:09:08.823580] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 15.657 ms 00:18:48.897 [2024-12-15 05:09:08.823590] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.826597] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.826764] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:18:48.897 [2024-12-15 05:09:08.826783] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.887 ms 00:18:48.897 [2024-12-15 05:09:08.826790] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.829466] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.829516] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:18:48.897 [2024-12-15 05:09:08.829527] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.630 ms 00:18:48.897 [2024-12-15 05:09:08.829535] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.829877] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.829899] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:18:48.897 [2024-12-15 05:09:08.829908] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.264 ms 00:18:48.897 [2024-12-15 05:09:08.829917] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.854887] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.854947] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:18:48.897 [2024-12-15 05:09:08.854961] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 
24.946 ms 00:18:48.897 [2024-12-15 05:09:08.854970] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.863181] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 59 (of 60) MiB 00:18:48.897 [2024-12-15 05:09:08.882165] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.882216] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:18:48.897 [2024-12-15 05:09:08.882230] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 27.100 ms 00:18:48.897 [2024-12-15 05:09:08.882246] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.882335] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.882347] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:18:48.897 [2024-12-15 05:09:08.882358] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.015 ms 00:18:48.897 [2024-12-15 05:09:08.882370] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.882426] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.882474] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:18:48.897 [2024-12-15 05:09:08.882484] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.034 ms 00:18:48.897 [2024-12-15 05:09:08.882496] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.882527] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.882537] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:18:48.897 [2024-12-15 05:09:08.882546] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.011 ms 00:18:48.897 [2024-12-15 05:09:08.882554] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.882595] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:18:48.897 [2024-12-15 05:09:08.882605] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.882614] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:18:48.897 [2024-12-15 05:09:08.882623] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.011 ms 00:18:48.897 [2024-12-15 05:09:08.882631] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.888135] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.888182] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:18:48.897 [2024-12-15 05:09:08.888195] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.475 ms 00:18:48.897 [2024-12-15 05:09:08.888203] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 [2024-12-15 05:09:08.888303] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:18:48.897 [2024-12-15 05:09:08.888320] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:18:48.897 [2024-12-15 05:09:08.888330] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.039 ms 00:18:48.897 [2024-12-15 05:09:08.888339] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:18:48.897 
[2024-12-15 05:09:08.889391] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:18:48.898 [2024-12-15 05:09:08.890742] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 157.003 ms, result 0 00:18:48.898 [2024-12-15 05:09:08.891705] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:18:48.898 [2024-12-15 05:09:08.899369] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:18:49.843  [2024-12-15T05:09:10.928Z] Copying: 18/256 [MB] (18 MBps) [2024-12-15T05:09:12.331Z] Copying: 33/256 [MB] (15 MBps) [2024-12-15T05:09:12.905Z] Copying: 49/256 [MB] (16 MBps) [2024-12-15T05:09:14.295Z] Copying: 75/256 [MB] (25 MBps) [2024-12-15T05:09:15.240Z] Copying: 90/256 [MB] (15 MBps) [2024-12-15T05:09:16.183Z] Copying: 106/256 [MB] (15 MBps) [2024-12-15T05:09:17.127Z] Copying: 117/256 [MB] (10 MBps) [2024-12-15T05:09:18.073Z] Copying: 128/256 [MB] (11 MBps) [2024-12-15T05:09:19.016Z] Copying: 148/256 [MB] (19 MBps) [2024-12-15T05:09:19.960Z] Copying: 163/256 [MB] (15 MBps) [2024-12-15T05:09:20.904Z] Copying: 173/256 [MB] (10 MBps) [2024-12-15T05:09:22.292Z] Copying: 184/256 [MB] (10 MBps) [2024-12-15T05:09:23.237Z] Copying: 194/256 [MB] (10 MBps) [2024-12-15T05:09:24.182Z] Copying: 208860/262144 [kB] (10016 kBps) [2024-12-15T05:09:25.126Z] Copying: 219080/262144 [kB] (10220 kBps) [2024-12-15T05:09:26.069Z] Copying: 224/256 [MB] (10 MBps) [2024-12-15T05:09:26.330Z] Copying: 249/256 [MB] (24 MBps) [2024-12-15T05:09:26.330Z] Copying: 256/256 [MB] (average 14 MBps)[2024-12-15 05:09:26.288113] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:19:06.190 [2024-12-15 05:09:26.290205] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.191 [2024-12-15 05:09:26.290380] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:19:06.191 [2024-12-15 05:09:26.290479] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:06.191 [2024-12-15 05:09:26.290506] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.191 [2024-12-15 05:09:26.290549] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on ftl_core_thread 00:19:06.191 [2024-12-15 05:09:26.291306] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.191 [2024-12-15 05:09:26.291477] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:19:06.191 [2024-12-15 05:09:26.291556] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.716 ms 00:19:06.191 [2024-12-15 05:09:26.291584] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.191 [2024-12-15 05:09:26.294512] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.191 [2024-12-15 05:09:26.294676] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:19:06.191 [2024-12-15 05:09:26.294745] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.877 ms 00:19:06.191 [2024-12-15 05:09:26.294778] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.191 [2024-12-15 05:09:26.302586] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.191 [2024-12-15 05:09:26.302801] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] 
name: Persist L2P 00:19:06.191 [2024-12-15 05:09:26.302872] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.770 ms 00:19:06.191 [2024-12-15 05:09:26.302897] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.191 [2024-12-15 05:09:26.309859] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.191 [2024-12-15 05:09:26.310022] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:19:06.191 [2024-12-15 05:09:26.310053] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.895 ms 00:19:06.191 [2024-12-15 05:09:26.310065] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.191 [2024-12-15 05:09:26.313188] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.191 [2024-12-15 05:09:26.313347] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:19:06.191 [2024-12-15 05:09:26.313364] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.076 ms 00:19:06.191 [2024-12-15 05:09:26.313372] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.191 [2024-12-15 05:09:26.319073] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.191 [2024-12-15 05:09:26.319135] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:19:06.191 [2024-12-15 05:09:26.319147] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.648 ms 00:19:06.191 [2024-12-15 05:09:26.319156] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.191 [2024-12-15 05:09:26.319297] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.191 [2024-12-15 05:09:26.319310] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:19:06.191 [2024-12-15 05:09:26.319320] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.092 ms 00:19:06.191 [2024-12-15 05:09:26.319332] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.191 [2024-12-15 05:09:26.323130] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.191 [2024-12-15 05:09:26.323298] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:19:06.191 [2024-12-15 05:09:26.323317] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.778 ms 00:19:06.191 [2024-12-15 05:09:26.323325] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.191 [2024-12-15 05:09:26.326175] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.191 [2024-12-15 05:09:26.326226] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:19:06.191 [2024-12-15 05:09:26.326236] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.809 ms 00:19:06.191 [2024-12-15 05:09:26.326242] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.191 [2024-12-15 05:09:26.328281] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.454 [2024-12-15 05:09:26.328460] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:19:06.454 [2024-12-15 05:09:26.328478] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.993 ms 00:19:06.454 [2024-12-15 05:09:26.328487] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.454 [2024-12-15 05:09:26.330805] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.454 [2024-12-15 
05:09:26.330855] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:19:06.454 [2024-12-15 05:09:26.330866] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.246 ms 00:19:06.454 [2024-12-15 05:09:26.330873] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.454 [2024-12-15 05:09:26.330914] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:19:06.454 [2024-12-15 05:09:26.330929] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.330940] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.330948] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.330955] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.330963] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.330970] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.330978] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.330986] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.330993] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331001] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331008] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331016] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331023] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331031] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331038] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331045] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331052] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331059] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331066] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331074] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331081] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331088] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 
wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331096] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331103] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331110] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331117] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331127] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331134] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331142] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331150] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331158] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331166] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331174] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331181] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331188] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331196] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331203] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331211] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331218] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331226] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331234] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331241] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331248] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331256] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331263] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331270] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331279] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] 
Band 47: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331286] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331294] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331301] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331309] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331317] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331324] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331331] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331339] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331347] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331354] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331361] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331368] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331375] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331383] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331390] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331402] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331410] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331417] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331425] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331454] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331461] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331469] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331477] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331485] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331492] ftl_debug.c: 167:ftl_dev_dump_bands: 
*NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331499] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:19:06.454 [2024-12-15 05:09:26.331507] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331514] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331522] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331529] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331537] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331544] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331551] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331559] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331566] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331574] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331582] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331590] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331598] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331607] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331616] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331623] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331631] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331638] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331646] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331654] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331661] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331671] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331678] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331686] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331703] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331711] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331718] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:19:06.455 [2024-12-15 05:09:26.331734] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:19:06.455 [2024-12-15 05:09:26.331743] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 3af9b670-83ba-4af2-8311-25d469da50d8 00:19:06.455 [2024-12-15 05:09:26.331752] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:19:06.455 [2024-12-15 05:09:26.331760] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:19:06.455 [2024-12-15 05:09:26.331766] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:19:06.455 [2024-12-15 05:09:26.331775] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:19:06.455 [2024-12-15 05:09:26.331782] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:19:06.455 [2024-12-15 05:09:26.331790] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:19:06.455 [2024-12-15 05:09:26.331802] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:19:06.455 [2024-12-15 05:09:26.331808] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:19:06.455 [2024-12-15 05:09:26.331815] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:19:06.455 [2024-12-15 05:09:26.331822] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.455 [2024-12-15 05:09:26.331844] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:19:06.455 [2024-12-15 05:09:26.331854] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.909 ms 00:19:06.455 [2024-12-15 05:09:26.331866] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.334279] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.455 [2024-12-15 05:09:26.334472] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:19:06.455 [2024-12-15 05:09:26.334491] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.393 ms 00:19:06.455 [2024-12-15 05:09:26.334499] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.334650] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:06.455 [2024-12-15 05:09:26.334660] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:19:06.455 [2024-12-15 05:09:26.334670] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.104 ms 00:19:06.455 [2024-12-15 05:09:26.334677] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.342895] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.342952] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:06.455 [2024-12-15 05:09:26.342963] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.342972] mngt/ftl_mngt.c: 431:trace_step: 
*NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.343053] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.343061] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:06.455 [2024-12-15 05:09:26.343069] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.343081] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.343133] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.343144] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:06.455 [2024-12-15 05:09:26.343151] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.343158] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.343178] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.343186] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:06.455 [2024-12-15 05:09:26.343194] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.343201] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.358395] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.358641] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:06.455 [2024-12-15 05:09:26.358662] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.358681] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.369948] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.370133] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:06.455 [2024-12-15 05:09:26.370150] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.370159] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.370224] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.370233] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:06.455 [2024-12-15 05:09:26.370242] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.370250] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.370283] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.370296] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:06.455 [2024-12-15 05:09:26.370305] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.370314] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.370397] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.370408] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:06.455 [2024-12-15 05:09:26.370417] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 
ms 00:19:06.455 [2024-12-15 05:09:26.370424] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.370532] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.370543] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:19:06.455 [2024-12-15 05:09:26.370555] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.370563] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.370612] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.370622] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:06.455 [2024-12-15 05:09:26.370630] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.370638] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.370691] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:06.455 [2024-12-15 05:09:26.370705] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:06.455 [2024-12-15 05:09:26.370715] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:06.455 [2024-12-15 05:09:26.370724] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:06.455 [2024-12-15 05:09:26.370875] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 80.667 ms, result 0 00:19:06.717 00:19:06.717 00:19:06.717 05:09:26 ftl.ftl_trim -- ftl/trim.sh@72 -- # svcpid=89728 00:19:06.717 05:09:26 ftl.ftl_trim -- ftl/trim.sh@73 -- # waitforlisten 89728 00:19:06.717 05:09:26 ftl.ftl_trim -- common/autotest_common.sh@835 -- # '[' -z 89728 ']' 00:19:06.717 05:09:26 ftl.ftl_trim -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:19:06.717 05:09:26 ftl.ftl_trim -- ftl/trim.sh@71 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -L ftl_init 00:19:06.717 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:19:06.717 05:09:26 ftl.ftl_trim -- common/autotest_common.sh@840 -- # local max_retries=100 00:19:06.717 05:09:26 ftl.ftl_trim -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:19:06.717 05:09:26 ftl.ftl_trim -- common/autotest_common.sh@844 -- # xtrace_disable 00:19:06.717 05:09:26 ftl.ftl_trim -- common/autotest_common.sh@10 -- # set +x 00:19:06.717 [2024-12-15 05:09:26.778331] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
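trim.sh@71-73 above launch a fresh spdk_tgt with FTL init logging and block in waitforlisten until the target's RPC socket at /var/tmp/spdk.sock answers; trim.sh@75 (below) then replays the saved configuration through rpc.py load_config. A minimal sketch of that launch-and-wait pattern, assuming only the binary and socket paths shown in the trace: the polling loop and retry budget are illustrative rather than the autotest helper itself, and piping ftl.json into load_config is an assumption, since the redirection is not visible in xtrace:

    /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -L ftl_init &
    svcpid=$!                                 # corresponds to svcpid=89728 in the trace
    # Poll the RPC socket until the target responds (retry budget is illustrative).
    for _ in $(seq 1 100); do
        /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.sock rpc_get_methods >/dev/null 2>&1 && break
        sleep 0.1
    done
    # Assumed stdin source; the traced command shows only "rpc.py load_config".
    /home/vagrant/spdk_repo/spdk/scripts/rpc.py load_config < /home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json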
00:19:06.717 [2024-12-15 05:09:26.778724] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid89728 ] 00:19:06.977 [2024-12-15 05:09:26.947548] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:06.977 [2024-12-15 05:09:26.977252] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:19:07.549 05:09:27 ftl.ftl_trim -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:19:07.549 05:09:27 ftl.ftl_trim -- common/autotest_common.sh@868 -- # return 0 00:19:07.549 05:09:27 ftl.ftl_trim -- ftl/trim.sh@75 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py load_config 00:19:07.811 [2024-12-15 05:09:27.843505] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:19:07.811 [2024-12-15 05:09:27.843591] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:19:08.074 [2024-12-15 05:09:28.017356] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.017423] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:19:08.074 [2024-12-15 05:09:28.017452] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:19:08.074 [2024-12-15 05:09:28.017463] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.020108] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.020164] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:08.074 [2024-12-15 05:09:28.020176] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.623 ms 00:19:08.074 [2024-12-15 05:09:28.020186] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.020305] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:19:08.074 [2024-12-15 05:09:28.020601] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:19:08.074 [2024-12-15 05:09:28.020618] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.020630] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:08.074 [2024-12-15 05:09:28.020641] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.324 ms 00:19:08.074 [2024-12-15 05:09:28.020657] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.022593] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:19:08.074 [2024-12-15 05:09:28.026672] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.026727] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:19:08.074 [2024-12-15 05:09:28.026741] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.077 ms 00:19:08.074 [2024-12-15 05:09:28.026749] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.026834] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.026845] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:19:08.074 [2024-12-15 05:09:28.026859] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.028 ms 00:19:08.074 [2024-12-15 05:09:28.026867] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.035190] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.035235] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:08.074 [2024-12-15 05:09:28.035250] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.265 ms 00:19:08.074 [2024-12-15 05:09:28.035258] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.035374] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.035384] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:08.074 [2024-12-15 05:09:28.035395] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.054 ms 00:19:08.074 [2024-12-15 05:09:28.035406] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.035473] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.035486] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:19:08.074 [2024-12-15 05:09:28.035497] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.045 ms 00:19:08.074 [2024-12-15 05:09:28.035504] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.035536] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on ftl_core_thread 00:19:08.074 [2024-12-15 05:09:28.037624] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.037806] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:08.074 [2024-12-15 05:09:28.037827] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.097 ms 00:19:08.074 [2024-12-15 05:09:28.037837] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.037885] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.037896] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:19:08.074 [2024-12-15 05:09:28.037904] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.015 ms 00:19:08.074 [2024-12-15 05:09:28.037914] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.037936] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:19:08.074 [2024-12-15 05:09:28.037961] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:19:08.074 [2024-12-15 05:09:28.038004] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:19:08.074 [2024-12-15 05:09:28.038025] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:19:08.074 [2024-12-15 05:09:28.038136] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:19:08.074 [2024-12-15 05:09:28.038152] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:19:08.074 [2024-12-15 05:09:28.038163] upgrade/ftl_sb_v5.c: 
109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:19:08.074 [2024-12-15 05:09:28.038176] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:19:08.074 [2024-12-15 05:09:28.038189] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:19:08.074 [2024-12-15 05:09:28.038204] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 23592960 00:19:08.074 [2024-12-15 05:09:28.038212] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:19:08.074 [2024-12-15 05:09:28.038222] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:19:08.074 [2024-12-15 05:09:28.038232] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:19:08.074 [2024-12-15 05:09:28.038242] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.038249] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:19:08.074 [2024-12-15 05:09:28.038258] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.306 ms 00:19:08.074 [2024-12-15 05:09:28.038267] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.038356] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.074 [2024-12-15 05:09:28.038366] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:19:08.074 [2024-12-15 05:09:28.038377] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:19:08.074 [2024-12-15 05:09:28.038385] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.074 [2024-12-15 05:09:28.038514] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:19:08.074 [2024-12-15 05:09:28.038526] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:19:08.074 [2024-12-15 05:09:28.038538] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:08.074 [2024-12-15 05:09:28.038547] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:08.074 [2024-12-15 05:09:28.038562] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:19:08.074 [2024-12-15 05:09:28.038576] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:19:08.074 [2024-12-15 05:09:28.038587] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 90.00 MiB 00:19:08.074 [2024-12-15 05:09:28.038595] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:19:08.074 [2024-12-15 05:09:28.038606] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.12 MiB 00:19:08.074 [2024-12-15 05:09:28.038614] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:08.074 [2024-12-15 05:09:28.038624] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:19:08.074 [2024-12-15 05:09:28.038633] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.62 MiB 00:19:08.074 [2024-12-15 05:09:28.038643] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:08.074 [2024-12-15 05:09:28.038651] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:19:08.074 [2024-12-15 05:09:28.038660] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.88 MiB 00:19:08.074 [2024-12-15 05:09:28.038669] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:08.074 
[2024-12-15 05:09:28.038679] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:19:08.074 [2024-12-15 05:09:28.038691] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 124.00 MiB 00:19:08.074 [2024-12-15 05:09:28.038701] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:08.074 [2024-12-15 05:09:28.038709] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:19:08.074 [2024-12-15 05:09:28.038721] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 91.12 MiB 00:19:08.074 [2024-12-15 05:09:28.038730] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:08.074 [2024-12-15 05:09:28.038740] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:19:08.074 [2024-12-15 05:09:28.038747] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 99.12 MiB 00:19:08.074 [2024-12-15 05:09:28.038758] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:08.074 [2024-12-15 05:09:28.038766] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:19:08.074 [2024-12-15 05:09:28.038776] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 107.12 MiB 00:19:08.074 [2024-12-15 05:09:28.038783] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:08.074 [2024-12-15 05:09:28.038794] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:19:08.074 [2024-12-15 05:09:28.038800] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 115.12 MiB 00:19:08.074 [2024-12-15 05:09:28.038809] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:08.074 [2024-12-15 05:09:28.038815] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:19:08.074 [2024-12-15 05:09:28.038824] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.12 MiB 00:19:08.075 [2024-12-15 05:09:28.038831] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:08.075 [2024-12-15 05:09:28.038839] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:19:08.075 [2024-12-15 05:09:28.038845] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.38 MiB 00:19:08.075 [2024-12-15 05:09:28.038856] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:08.075 [2024-12-15 05:09:28.038862] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:19:08.075 [2024-12-15 05:09:28.038871] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.62 MiB 00:19:08.075 [2024-12-15 05:09:28.038879] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:08.075 [2024-12-15 05:09:28.038888] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:19:08.075 [2024-12-15 05:09:28.038894] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.75 MiB 00:19:08.075 [2024-12-15 05:09:28.038903] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:08.075 [2024-12-15 05:09:28.038909] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:19:08.075 [2024-12-15 05:09:28.038919] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:19:08.075 [2024-12-15 05:09:28.038926] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:08.075 [2024-12-15 05:09:28.038935] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:08.075 [2024-12-15 05:09:28.038943] ftl_layout.c: 130:dump_region: 
*NOTICE*: [FTL][ftl0] Region vmap 00:19:08.075 [2024-12-15 05:09:28.038952] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:19:08.075 [2024-12-15 05:09:28.038959] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:19:08.075 [2024-12-15 05:09:28.038969] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:19:08.075 [2024-12-15 05:09:28.038975] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:19:08.075 [2024-12-15 05:09:28.038987] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:19:08.075 [2024-12-15 05:09:28.038996] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:19:08.075 [2024-12-15 05:09:28.039008] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:08.075 [2024-12-15 05:09:28.039019] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5a00 00:19:08.075 [2024-12-15 05:09:28.039029] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5a20 blk_sz:0x80 00:19:08.075 [2024-12-15 05:09:28.039036] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x5aa0 blk_sz:0x80 00:19:08.075 [2024-12-15 05:09:28.039045] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5b20 blk_sz:0x800 00:19:08.075 [2024-12-15 05:09:28.039052] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x6320 blk_sz:0x800 00:19:08.075 [2024-12-15 05:09:28.039062] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6b20 blk_sz:0x800 00:19:08.075 [2024-12-15 05:09:28.039068] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x7320 blk_sz:0x800 00:19:08.075 [2024-12-15 05:09:28.039078] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7b20 blk_sz:0x40 00:19:08.075 [2024-12-15 05:09:28.039085] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7b60 blk_sz:0x40 00:19:08.075 [2024-12-15 05:09:28.039094] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x7ba0 blk_sz:0x20 00:19:08.075 [2024-12-15 05:09:28.039101] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x7bc0 blk_sz:0x20 00:19:08.075 [2024-12-15 05:09:28.039110] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x7be0 blk_sz:0x20 00:19:08.075 [2024-12-15 05:09:28.039118] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7c00 blk_sz:0x20 00:19:08.075 [2024-12-15 05:09:28.039129] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7c20 blk_sz:0x13b6e0 00:19:08.075 [2024-12-15 05:09:28.039136] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:19:08.075 [2024-12-15 
05:09:28.039150] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:08.075 [2024-12-15 05:09:28.039158] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:19:08.075 [2024-12-15 05:09:28.039167] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:19:08.075 [2024-12-15 05:09:28.039174] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:19:08.075 [2024-12-15 05:09:28.039183] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:19:08.075 [2024-12-15 05:09:28.039191] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.039206] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:19:08.075 [2024-12-15 05:09:28.039214] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.770 ms 00:19:08.075 [2024-12-15 05:09:28.039223] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.055012] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.055060] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:08.075 [2024-12-15 05:09:28.055073] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 15.730 ms 00:19:08.075 [2024-12-15 05:09:28.055085] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.055227] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.055244] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:19:08.075 [2024-12-15 05:09:28.055253] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.066 ms 00:19:08.075 [2024-12-15 05:09:28.055263] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.068749] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.068947] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:08.075 [2024-12-15 05:09:28.068967] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 13.464 ms 00:19:08.075 [2024-12-15 05:09:28.068980] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.069053] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.069066] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:08.075 [2024-12-15 05:09:28.069075] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:08.075 [2024-12-15 05:09:28.069085] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.069648] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.069675] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:08.075 [2024-12-15 05:09:28.069686] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.540 ms 00:19:08.075 [2024-12-15 05:09:28.069697] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: 
[FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.069855] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.069870] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:08.075 [2024-12-15 05:09:28.069879] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.127 ms 00:19:08.075 [2024-12-15 05:09:28.069890] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.078527] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.078577] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:08.075 [2024-12-15 05:09:28.078588] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.613 ms 00:19:08.075 [2024-12-15 05:09:28.078599] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.091520] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 1, empty chunks = 3 00:19:08.075 [2024-12-15 05:09:28.091587] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:19:08.075 [2024-12-15 05:09:28.091605] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.091619] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:19:08.075 [2024-12-15 05:09:28.091633] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 12.901 ms 00:19:08.075 [2024-12-15 05:09:28.091646] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.109501] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.109563] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:19:08.075 [2024-12-15 05:09:28.109577] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 17.782 ms 00:19:08.075 [2024-12-15 05:09:28.109590] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.112791] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.112849] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:19:08.075 [2024-12-15 05:09:28.112860] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.082 ms 00:19:08.075 [2024-12-15 05:09:28.112870] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.115645] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.115699] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:19:08.075 [2024-12-15 05:09:28.115709] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.723 ms 00:19:08.075 [2024-12-15 05:09:28.115718] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.116091] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.116107] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:19:08.075 [2024-12-15 05:09:28.116116] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.290 ms 00:19:08.075 [2024-12-15 05:09:28.116126] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 
05:09:28.141631] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.141696] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:19:08.075 [2024-12-15 05:09:28.141710] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 25.478 ms 00:19:08.075 [2024-12-15 05:09:28.141723] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.075 [2024-12-15 05:09:28.149935] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 59 (of 60) MiB 00:19:08.075 [2024-12-15 05:09:28.168837] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.075 [2024-12-15 05:09:28.168887] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:19:08.076 [2024-12-15 05:09:28.168902] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 27.004 ms 00:19:08.076 [2024-12-15 05:09:28.168911] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.076 [2024-12-15 05:09:28.168994] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.076 [2024-12-15 05:09:28.169013] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:19:08.076 [2024-12-15 05:09:28.169025] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.015 ms 00:19:08.076 [2024-12-15 05:09:28.169034] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.076 [2024-12-15 05:09:28.169096] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.076 [2024-12-15 05:09:28.169104] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:19:08.076 [2024-12-15 05:09:28.169115] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.041 ms 00:19:08.076 [2024-12-15 05:09:28.169123] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.076 [2024-12-15 05:09:28.169149] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.076 [2024-12-15 05:09:28.169157] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:19:08.076 [2024-12-15 05:09:28.169175] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:19:08.076 [2024-12-15 05:09:28.169183] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.076 [2024-12-15 05:09:28.169224] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:19:08.076 [2024-12-15 05:09:28.169235] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.076 [2024-12-15 05:09:28.169245] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:19:08.076 [2024-12-15 05:09:28.169253] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:19:08.076 [2024-12-15 05:09:28.169262] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.076 [2024-12-15 05:09:28.175496] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.076 [2024-12-15 05:09:28.175559] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:19:08.076 [2024-12-15 05:09:28.175570] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.211 ms 00:19:08.076 [2024-12-15 05:09:28.175583] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.076 [2024-12-15 05:09:28.175680] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.076 [2024-12-15 05:09:28.175693] 
mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:19:08.076 [2024-12-15 05:09:28.175702] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.044 ms 00:19:08.076 [2024-12-15 05:09:28.175712] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.076 [2024-12-15 05:09:28.176761] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:19:08.076 [2024-12-15 05:09:28.178173] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 159.082 ms, result 0 00:19:08.076 [2024-12-15 05:09:28.180276] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:19:08.076 Some configs were skipped because the RPC state that can call them passed over. 00:19:08.337 05:09:28 ftl.ftl_trim -- ftl/trim.sh@78 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_unmap -b ftl0 --lba 0 --num_blocks 1024 00:19:08.337 [2024-12-15 05:09:28.413957] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.337 [2024-12-15 05:09:28.414148] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Process trim 00:19:08.337 [2024-12-15 05:09:28.414235] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.165 ms 00:19:08.337 [2024-12-15 05:09:28.414261] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.337 [2024-12-15 05:09:28.414320] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL trim', duration = 3.531 ms, result 0 00:19:08.337 true 00:19:08.337 05:09:28 ftl.ftl_trim -- ftl/trim.sh@79 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_unmap -b ftl0 --lba 23591936 --num_blocks 1024 00:19:08.599 [2024-12-15 05:09:28.638906] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.599 [2024-12-15 05:09:28.638972] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Process trim 00:19:08.599 [2024-12-15 05:09:28.638986] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.000 ms 00:19:08.599 [2024-12-15 05:09:28.638996] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.599 [2024-12-15 05:09:28.639034] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL trim', duration = 3.129 ms, result 0 00:19:08.599 true 00:19:08.599 05:09:28 ftl.ftl_trim -- ftl/trim.sh@81 -- # killprocess 89728 00:19:08.599 05:09:28 ftl.ftl_trim -- common/autotest_common.sh@954 -- # '[' -z 89728 ']' 00:19:08.599 05:09:28 ftl.ftl_trim -- common/autotest_common.sh@958 -- # kill -0 89728 00:19:08.599 05:09:28 ftl.ftl_trim -- common/autotest_common.sh@959 -- # uname 00:19:08.599 05:09:28 ftl.ftl_trim -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:19:08.599 05:09:28 ftl.ftl_trim -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 89728 00:19:08.599 killing process with pid 89728 00:19:08.599 05:09:28 ftl.ftl_trim -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:19:08.599 05:09:28 ftl.ftl_trim -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:19:08.599 05:09:28 ftl.ftl_trim -- common/autotest_common.sh@972 -- # echo 'killing process with pid 89728' 00:19:08.599 05:09:28 ftl.ftl_trim -- common/autotest_common.sh@973 -- # kill 89728 00:19:08.599 05:09:28 ftl.ftl_trim -- common/autotest_common.sh@978 -- # wait 89728 00:19:08.865 [2024-12-15 05:09:28.824836] 
mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.824899] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:19:08.865 [2024-12-15 05:09:28.824916] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:08.865 [2024-12-15 05:09:28.824930] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.824962] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on ftl_core_thread 00:19:08.865 [2024-12-15 05:09:28.825606] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.825639] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:19:08.865 [2024-12-15 05:09:28.825653] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.630 ms 00:19:08.865 [2024-12-15 05:09:28.825662] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.825980] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.826000] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:19:08.865 [2024-12-15 05:09:28.826010] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.272 ms 00:19:08.865 [2024-12-15 05:09:28.826021] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.830496] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.830537] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:19:08.865 [2024-12-15 05:09:28.830548] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.453 ms 00:19:08.865 [2024-12-15 05:09:28.830565] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.837493] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.837536] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:19:08.865 [2024-12-15 05:09:28.837547] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.888 ms 00:19:08.865 [2024-12-15 05:09:28.837559] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.840120] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.840292] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:19:08.865 [2024-12-15 05:09:28.840310] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.482 ms 00:19:08.865 [2024-12-15 05:09:28.840319] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.845057] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.845112] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:19:08.865 [2024-12-15 05:09:28.845125] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.697 ms 00:19:08.865 [2024-12-15 05:09:28.845140] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.845280] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.845293] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:19:08.865 [2024-12-15 05:09:28.845302] mngt/ftl_mngt.c: 
430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.095 ms 00:19:08.865 [2024-12-15 05:09:28.845313] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.847927] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.847993] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:19:08.865 [2024-12-15 05:09:28.848004] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.595 ms 00:19:08.865 [2024-12-15 05:09:28.848019] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.850838] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.850886] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:19:08.865 [2024-12-15 05:09:28.850896] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.763 ms 00:19:08.865 [2024-12-15 05:09:28.850905] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.853034] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.853082] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:19:08.865 [2024-12-15 05:09:28.853091] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.086 ms 00:19:08.865 [2024-12-15 05:09:28.853100] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.855270] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.865 [2024-12-15 05:09:28.855320] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:19:08.865 [2024-12-15 05:09:28.855330] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.101 ms 00:19:08.865 [2024-12-15 05:09:28.855339] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.865 [2024-12-15 05:09:28.855378] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:19:08.865 [2024-12-15 05:09:28.855395] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855405] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855417] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855425] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855452] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855461] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855470] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855478] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855487] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855495] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855506] 
ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855514] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855524] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855531] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855540] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855548] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855557] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855565] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855585] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855593] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855602] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855609] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855619] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855626] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855635] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:19:08.865 [2024-12-15 05:09:28.855642] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855651] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855659] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855669] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855677] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855687] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855695] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855705] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855713] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855724] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 
[2024-12-15 05:09:28.855732] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855742] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855750] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855760] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855767] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855776] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855783] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855792] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855800] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855809] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855816] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855825] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855833] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855842] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855849] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855861] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855869] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855877] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855884] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855894] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855901] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855910] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855917] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855926] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855934] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 
state: free 00:19:08.866 [2024-12-15 05:09:28.855942] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855950] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855959] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855968] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855979] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855987] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.855998] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856006] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856015] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856023] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856032] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856040] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856050] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856058] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856092] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856100] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856110] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856117] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856127] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856134] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856145] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856159] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856170] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856178] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856187] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 
0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856194] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856204] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856211] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856222] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856229] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856238] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856246] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856255] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856262] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856276] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856286] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856295] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856304] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856315] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856322] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:19:08.866 [2024-12-15 05:09:28.856340] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:19:08.866 [2024-12-15 05:09:28.856348] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 3af9b670-83ba-4af2-8311-25d469da50d8 00:19:08.866 [2024-12-15 05:09:28.856360] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:19:08.866 [2024-12-15 05:09:28.856368] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:19:08.866 [2024-12-15 05:09:28.856382] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:19:08.866 [2024-12-15 05:09:28.856390] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:19:08.866 [2024-12-15 05:09:28.856399] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:19:08.866 [2024-12-15 05:09:28.856410] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:19:08.866 [2024-12-15 05:09:28.856419] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:19:08.866 [2024-12-15 05:09:28.856426] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:19:08.866 [2024-12-15 05:09:28.856450] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:19:08.866 [2024-12-15 05:09:28.856458] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
00:19:08.866 [2024-12-15 05:09:28.856468] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:19:08.866 [2024-12-15 05:09:28.856476] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.081 ms 00:19:08.866 [2024-12-15 05:09:28.856487] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.866 [2024-12-15 05:09:28.858476] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.866 [2024-12-15 05:09:28.858509] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:19:08.866 [2024-12-15 05:09:28.858519] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.964 ms 00:19:08.866 [2024-12-15 05:09:28.858533] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.866 [2024-12-15 05:09:28.858675] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:08.866 [2024-12-15 05:09:28.858699] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:19:08.866 [2024-12-15 05:09:28.858708] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.090 ms 00:19:08.867 [2024-12-15 05:09:28.858718] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.866268] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.866464] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:08.867 [2024-12-15 05:09:28.866488] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.866499] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.866576] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.866587] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:08.867 [2024-12-15 05:09:28.866597] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.866609] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.866660] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.866676] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:08.867 [2024-12-15 05:09:28.866687] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.866697] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.866716] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.866727] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:08.867 [2024-12-15 05:09:28.866734] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.866743] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.880765] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.880833] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:08.867 [2024-12-15 05:09:28.880845] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.880862] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 
05:09:28.891559] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.891618] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:08.867 [2024-12-15 05:09:28.891631] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.891645] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.891716] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.891729] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:08.867 [2024-12-15 05:09:28.891738] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.891749] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.891783] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.891795] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:08.867 [2024-12-15 05:09:28.891803] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.891814] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.891899] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.891915] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:08.867 [2024-12-15 05:09:28.891924] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.891934] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.891967] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.891979] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:19:08.867 [2024-12-15 05:09:28.891987] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.892000] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.892047] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.892085] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:08.867 [2024-12-15 05:09:28.892097] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.892106] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.892156] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:08.867 [2024-12-15 05:09:28.892170] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:08.867 [2024-12-15 05:09:28.892179] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:08.867 [2024-12-15 05:09:28.892191] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:08.867 [2024-12-15 05:09:28.892349] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 67.480 ms, result 0 00:19:09.153 05:09:29 ftl.ftl_trim -- ftl/trim.sh@84 -- # file=/home/vagrant/spdk_repo/spdk/test/ftl/data 00:19:09.153 05:09:29 ftl.ftl_trim -- ftl/trim.sh@85 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=ftl0 
--of=/home/vagrant/spdk_repo/spdk/test/ftl/data --count=65536 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:19:09.153 [2024-12-15 05:09:29.195879] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:19:09.153 [2024-12-15 05:09:29.196050] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid89770 ] 00:19:09.421 [2024-12-15 05:09:29.359044] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:09.421 [2024-12-15 05:09:29.387585] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:19:09.421 [2024-12-15 05:09:29.506672] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:19:09.421 [2024-12-15 05:09:29.506766] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:19:09.683 [2024-12-15 05:09:29.668974] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.683 [2024-12-15 05:09:29.669206] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:19:09.683 [2024-12-15 05:09:29.669232] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:19:09.683 [2024-12-15 05:09:29.669242] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.683 [2024-12-15 05:09:29.671843] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.683 [2024-12-15 05:09:29.671893] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:09.683 [2024-12-15 05:09:29.671906] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.561 ms 00:19:09.683 [2024-12-15 05:09:29.671914] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.683 [2024-12-15 05:09:29.672026] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:19:09.683 [2024-12-15 05:09:29.672316] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:19:09.683 [2024-12-15 05:09:29.672335] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.683 [2024-12-15 05:09:29.672345] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:09.683 [2024-12-15 05:09:29.672355] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.323 ms 00:19:09.683 [2024-12-15 05:09:29.672364] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.683 [2024-12-15 05:09:29.674394] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:19:09.683 [2024-12-15 05:09:29.678384] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.683 [2024-12-15 05:09:29.678578] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:19:09.683 [2024-12-15 05:09:29.678658] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.992 ms 00:19:09.683 [2024-12-15 05:09:29.678689] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.683 [2024-12-15 05:09:29.678778] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.683 [2024-12-15 05:09:29.678807] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:19:09.683 [2024-12-15 05:09:29.678828] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: 
[FTL][ftl0] duration: 0.028 ms 00:19:09.683 [2024-12-15 05:09:29.678900] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.683 [2024-12-15 05:09:29.688406] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.683 [2024-12-15 05:09:29.688672] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:09.683 [2024-12-15 05:09:29.688703] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 9.431 ms 00:19:09.684 [2024-12-15 05:09:29.688717] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.684 [2024-12-15 05:09:29.688908] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.684 [2024-12-15 05:09:29.688931] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:09.684 [2024-12-15 05:09:29.688945] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.089 ms 00:19:09.684 [2024-12-15 05:09:29.688960] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.684 [2024-12-15 05:09:29.689006] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.684 [2024-12-15 05:09:29.689019] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:19:09.684 [2024-12-15 05:09:29.689035] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.017 ms 00:19:09.684 [2024-12-15 05:09:29.689047] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.684 [2024-12-15 05:09:29.689080] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on ftl_core_thread 00:19:09.684 [2024-12-15 05:09:29.691604] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.684 [2024-12-15 05:09:29.691667] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:09.684 [2024-12-15 05:09:29.691689] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.534 ms 00:19:09.684 [2024-12-15 05:09:29.691714] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.684 [2024-12-15 05:09:29.691792] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.684 [2024-12-15 05:09:29.691819] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:19:09.684 [2024-12-15 05:09:29.691833] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.022 ms 00:19:09.684 [2024-12-15 05:09:29.691846] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.684 [2024-12-15 05:09:29.691878] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:19:09.684 [2024-12-15 05:09:29.691909] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:19:09.684 [2024-12-15 05:09:29.691959] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:19:09.684 [2024-12-15 05:09:29.691988] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:19:09.684 [2024-12-15 05:09:29.692156] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:19:09.684 [2024-12-15 05:09:29.692179] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:19:09.684 [2024-12-15 05:09:29.692196] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: 
*NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:19:09.684 [2024-12-15 05:09:29.692214] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:19:09.684 [2024-12-15 05:09:29.692236] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:19:09.684 [2024-12-15 05:09:29.692252] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 23592960 00:19:09.684 [2024-12-15 05:09:29.692270] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:19:09.684 [2024-12-15 05:09:29.692283] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:19:09.684 [2024-12-15 05:09:29.692308] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:19:09.684 [2024-12-15 05:09:29.692331] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.684 [2024-12-15 05:09:29.692344] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:19:09.684 [2024-12-15 05:09:29.692357] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.452 ms 00:19:09.684 [2024-12-15 05:09:29.692369] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.684 [2024-12-15 05:09:29.692497] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.684 [2024-12-15 05:09:29.692517] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:19:09.684 [2024-12-15 05:09:29.692532] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.095 ms 00:19:09.684 [2024-12-15 05:09:29.692542] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.684 [2024-12-15 05:09:29.692668] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:19:09.684 [2024-12-15 05:09:29.692698] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:19:09.684 [2024-12-15 05:09:29.692707] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:09.684 [2024-12-15 05:09:29.692716] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692725] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:19:09.684 [2024-12-15 05:09:29.692733] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692739] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 90.00 MiB 00:19:09.684 [2024-12-15 05:09:29.692749] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:19:09.684 [2024-12-15 05:09:29.692757] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692766] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:09.684 [2024-12-15 05:09:29.692772] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:19:09.684 [2024-12-15 05:09:29.692779] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.62 MiB 00:19:09.684 [2024-12-15 05:09:29.692786] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:09.684 [2024-12-15 05:09:29.692793] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:19:09.684 [2024-12-15 05:09:29.692800] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.88 MiB 00:19:09.684 [2024-12-15 05:09:29.692807] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692815] ftl_layout.c: 
130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:19:09.684 [2024-12-15 05:09:29.692822] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 124.00 MiB 00:19:09.684 [2024-12-15 05:09:29.692829] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692835] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:19:09.684 [2024-12-15 05:09:29.692842] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 91.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692850] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:09.684 [2024-12-15 05:09:29.692857] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:19:09.684 [2024-12-15 05:09:29.692868] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 99.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692874] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:09.684 [2024-12-15 05:09:29.692882] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:19:09.684 [2024-12-15 05:09:29.692888] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 107.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692898] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:09.684 [2024-12-15 05:09:29.692905] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:19:09.684 [2024-12-15 05:09:29.692911] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 115.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692918] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:09.684 [2024-12-15 05:09:29.692925] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:19:09.684 [2024-12-15 05:09:29.692932] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692938] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:09.684 [2024-12-15 05:09:29.692944] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:19:09.684 [2024-12-15 05:09:29.692951] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.38 MiB 00:19:09.684 [2024-12-15 05:09:29.692958] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:09.684 [2024-12-15 05:09:29.692966] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:19:09.684 [2024-12-15 05:09:29.692972] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.62 MiB 00:19:09.684 [2024-12-15 05:09:29.692981] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:09.684 [2024-12-15 05:09:29.692989] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:19:09.684 [2024-12-15 05:09:29.692996] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.75 MiB 00:19:09.684 [2024-12-15 05:09:29.693002] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:09.684 [2024-12-15 05:09:29.693009] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:19:09.684 [2024-12-15 05:09:29.693020] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:19:09.684 [2024-12-15 05:09:29.693028] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:09.684 [2024-12-15 05:09:29.693036] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:09.684 [2024-12-15 05:09:29.693045] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:19:09.684 
[2024-12-15 05:09:29.693052] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:19:09.684 [2024-12-15 05:09:29.693058] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:19:09.684 [2024-12-15 05:09:29.693065] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:19:09.684 [2024-12-15 05:09:29.693071] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:19:09.684 [2024-12-15 05:09:29.693079] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:19:09.684 [2024-12-15 05:09:29.693087] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:19:09.684 [2024-12-15 05:09:29.693097] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:09.684 [2024-12-15 05:09:29.693107] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5a00 00:19:09.684 [2024-12-15 05:09:29.693115] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5a20 blk_sz:0x80 00:19:09.684 [2024-12-15 05:09:29.693122] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x5aa0 blk_sz:0x80 00:19:09.684 [2024-12-15 05:09:29.693129] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5b20 blk_sz:0x800 00:19:09.684 [2024-12-15 05:09:29.693137] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x6320 blk_sz:0x800 00:19:09.684 [2024-12-15 05:09:29.693144] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6b20 blk_sz:0x800 00:19:09.684 [2024-12-15 05:09:29.693151] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x7320 blk_sz:0x800 00:19:09.684 [2024-12-15 05:09:29.693158] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7b20 blk_sz:0x40 00:19:09.684 [2024-12-15 05:09:29.693165] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7b60 blk_sz:0x40 00:19:09.684 [2024-12-15 05:09:29.693173] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x7ba0 blk_sz:0x20 00:19:09.685 [2024-12-15 05:09:29.693179] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x7bc0 blk_sz:0x20 00:19:09.685 [2024-12-15 05:09:29.693186] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x7be0 blk_sz:0x20 00:19:09.685 [2024-12-15 05:09:29.693194] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7c00 blk_sz:0x20 00:19:09.685 [2024-12-15 05:09:29.693201] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7c20 blk_sz:0x13b6e0 00:19:09.685 [2024-12-15 05:09:29.693209] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:19:09.685 [2024-12-15 05:09:29.693220] upgrade/ftl_sb_v5.c: 
430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:09.685 [2024-12-15 05:09:29.693234] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:19:09.685 [2024-12-15 05:09:29.693242] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:19:09.685 [2024-12-15 05:09:29.693249] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:19:09.685 [2024-12-15 05:09:29.693257] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:19:09.685 [2024-12-15 05:09:29.693265] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.693273] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:19:09.685 [2024-12-15 05:09:29.693281] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.678 ms 00:19:09.685 [2024-12-15 05:09:29.693288] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.709055] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.709114] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:09.685 [2024-12-15 05:09:29.709127] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 15.690 ms 00:19:09.685 [2024-12-15 05:09:29.709136] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.709274] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.709293] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:19:09.685 [2024-12-15 05:09:29.709302] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.067 ms 00:19:09.685 [2024-12-15 05:09:29.709310] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.736166] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.736231] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:09.685 [2024-12-15 05:09:29.736244] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 26.829 ms 00:19:09.685 [2024-12-15 05:09:29.736252] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.736355] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.736368] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:09.685 [2024-12-15 05:09:29.736377] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:09.685 [2024-12-15 05:09:29.736386] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.736982] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.737020] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:09.685 [2024-12-15 05:09:29.737034] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.572 ms 00:19:09.685 [2024-12-15 05:09:29.737044] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 
05:09:29.737215] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.737240] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:09.685 [2024-12-15 05:09:29.737249] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.129 ms 00:19:09.685 [2024-12-15 05:09:29.737257] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.746200] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.746250] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:09.685 [2024-12-15 05:09:29.746261] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.914 ms 00:19:09.685 [2024-12-15 05:09:29.746275] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.750427] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 1, empty chunks = 3 00:19:09.685 [2024-12-15 05:09:29.750499] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:19:09.685 [2024-12-15 05:09:29.750512] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.750521] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:19:09.685 [2024-12-15 05:09:29.750530] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.113 ms 00:19:09.685 [2024-12-15 05:09:29.750538] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.771337] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.771392] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:19:09.685 [2024-12-15 05:09:29.771406] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 20.717 ms 00:19:09.685 [2024-12-15 05:09:29.771415] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.774497] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.774536] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:19:09.685 [2024-12-15 05:09:29.774547] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.931 ms 00:19:09.685 [2024-12-15 05:09:29.774555] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.777361] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.777412] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:19:09.685 [2024-12-15 05:09:29.777422] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.739 ms 00:19:09.685 [2024-12-15 05:09:29.777429] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.777802] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.777815] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:19:09.685 [2024-12-15 05:09:29.777825] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.279 ms 00:19:09.685 [2024-12-15 05:09:29.777833] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.805706] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: 
[FTL][ftl0] Action 00:19:09.685 [2024-12-15 05:09:29.805779] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:19:09.685 [2024-12-15 05:09:29.805795] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 27.850 ms 00:19:09.685 [2024-12-15 05:09:29.805808] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.685 [2024-12-15 05:09:29.814294] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 59 (of 60) MiB 00:19:09.946 [2024-12-15 05:09:29.834907] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.946 [2024-12-15 05:09:29.834969] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:19:09.946 [2024-12-15 05:09:29.834984] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 28.983 ms 00:19:09.946 [2024-12-15 05:09:29.834993] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.946 [2024-12-15 05:09:29.835103] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.946 [2024-12-15 05:09:29.835115] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:19:09.946 [2024-12-15 05:09:29.835129] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.017 ms 00:19:09.946 [2024-12-15 05:09:29.835138] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.946 [2024-12-15 05:09:29.835197] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.946 [2024-12-15 05:09:29.835207] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:19:09.946 [2024-12-15 05:09:29.835216] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.036 ms 00:19:09.946 [2024-12-15 05:09:29.835225] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.946 [2024-12-15 05:09:29.835254] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.946 [2024-12-15 05:09:29.835264] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:19:09.946 [2024-12-15 05:09:29.835273] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.012 ms 00:19:09.946 [2024-12-15 05:09:29.835284] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.946 [2024-12-15 05:09:29.835324] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:19:09.946 [2024-12-15 05:09:29.835336] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.946 [2024-12-15 05:09:29.835345] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:19:09.946 [2024-12-15 05:09:29.835353] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:19:09.946 [2024-12-15 05:09:29.835362] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.946 [2024-12-15 05:09:29.841876] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.946 [2024-12-15 05:09:29.841933] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:19:09.946 [2024-12-15 05:09:29.841946] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.490 ms 00:19:09.946 [2024-12-15 05:09:29.841963] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.946 [2024-12-15 05:09:29.842059] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:09.946 [2024-12-15 05:09:29.842071] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize 
initialization 00:19:09.946 [2024-12-15 05:09:29.842081] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.038 ms 00:19:09.946 [2024-12-15 05:09:29.842096] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:09.946 [2024-12-15 05:09:29.843170] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:19:09.946 [2024-12-15 05:09:29.844681] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 173.872 ms, result 0 00:19:09.946 [2024-12-15 05:09:29.845846] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:19:09.946 [2024-12-15 05:09:29.853494] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:19:10.888  [2024-12-15T05:09:31.972Z] Copying: 14/256 [MB] (14 MBps) [2024-12-15T05:09:32.915Z] Copying: 28/256 [MB] (14 MBps) [2024-12-15T05:09:34.300Z] Copying: 45/256 [MB] (16 MBps) [2024-12-15T05:09:34.872Z] Copying: 56/256 [MB] (10 MBps) [2024-12-15T05:09:36.257Z] Copying: 71/256 [MB] (15 MBps) [2024-12-15T05:09:37.200Z] Copying: 84/256 [MB] (13 MBps) [2024-12-15T05:09:38.143Z] Copying: 109/256 [MB] (25 MBps) [2024-12-15T05:09:39.086Z] Copying: 127/256 [MB] (17 MBps) [2024-12-15T05:09:40.030Z] Copying: 141/256 [MB] (14 MBps) [2024-12-15T05:09:40.975Z] Copying: 153/256 [MB] (12 MBps) [2024-12-15T05:09:41.916Z] Copying: 167784/262144 [kB] (10104 kBps) [2024-12-15T05:09:42.860Z] Copying: 181/256 [MB] (18 MBps) [2024-12-15T05:09:44.247Z] Copying: 196/256 [MB] (14 MBps) [2024-12-15T05:09:45.191Z] Copying: 213/256 [MB] (16 MBps) [2024-12-15T05:09:46.134Z] Copying: 230/256 [MB] (17 MBps) [2024-12-15T05:09:46.709Z] Copying: 244/256 [MB] (14 MBps) [2024-12-15T05:09:46.709Z] Copying: 256/256 [MB] (average 15 MBps)[2024-12-15 05:09:46.481772] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:19:26.569 [2024-12-15 05:09:46.482808] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.569 [2024-12-15 05:09:46.482843] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:19:26.569 [2024-12-15 05:09:46.482853] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.002 ms 00:19:26.569 [2024-12-15 05:09:46.482860] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.569 [2024-12-15 05:09:46.482879] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on ftl_core_thread 00:19:26.569 [2024-12-15 05:09:46.483256] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.569 [2024-12-15 05:09:46.483270] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:19:26.569 [2024-12-15 05:09:46.483276] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.367 ms 00:19:26.569 [2024-12-15 05:09:46.483282] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.569 [2024-12-15 05:09:46.483500] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.569 [2024-12-15 05:09:46.483509] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:19:26.569 [2024-12-15 05:09:46.483518] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.203 ms 00:19:26.569 [2024-12-15 05:09:46.483524] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.569 
[2024-12-15 05:09:46.486355] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.569 [2024-12-15 05:09:46.486375] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:19:26.569 [2024-12-15 05:09:46.486382] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.819 ms 00:19:26.569 [2024-12-15 05:09:46.486389] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.569 [2024-12-15 05:09:46.491591] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.569 [2024-12-15 05:09:46.491621] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:19:26.569 [2024-12-15 05:09:46.491628] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.186 ms 00:19:26.569 [2024-12-15 05:09:46.491636] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.569 [2024-12-15 05:09:46.493059] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.569 [2024-12-15 05:09:46.493180] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:19:26.569 [2024-12-15 05:09:46.493192] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.384 ms 00:19:26.569 [2024-12-15 05:09:46.493197] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.569 [2024-12-15 05:09:46.496661] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.569 [2024-12-15 05:09:46.496690] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:19:26.569 [2024-12-15 05:09:46.496697] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.445 ms 00:19:26.569 [2024-12-15 05:09:46.496702] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.569 [2024-12-15 05:09:46.496797] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.569 [2024-12-15 05:09:46.496805] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:19:26.569 [2024-12-15 05:09:46.496813] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.067 ms 00:19:26.569 [2024-12-15 05:09:46.496819] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.569 [2024-12-15 05:09:46.498680] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.569 [2024-12-15 05:09:46.498706] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:19:26.569 [2024-12-15 05:09:46.498713] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.849 ms 00:19:26.570 [2024-12-15 05:09:46.498718] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.570 [2024-12-15 05:09:46.499958] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.570 [2024-12-15 05:09:46.500061] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:19:26.570 [2024-12-15 05:09:46.500080] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.225 ms 00:19:26.570 [2024-12-15 05:09:46.500086] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.570 [2024-12-15 05:09:46.501174] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.570 [2024-12-15 05:09:46.501198] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:19:26.570 [2024-12-15 05:09:46.501204] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.071 ms 00:19:26.570 [2024-12-15 05:09:46.501209] mngt/ftl_mngt.c: 
431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.570 [2024-12-15 05:09:46.502247] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.570 [2024-12-15 05:09:46.502275] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:19:26.570 [2024-12-15 05:09:46.502282] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.003 ms 00:19:26.570 [2024-12-15 05:09:46.502287] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.570 [2024-12-15 05:09:46.502301] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:19:26.570 [2024-12-15 05:09:46.502311] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502319] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502325] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502330] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502336] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502342] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502348] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502353] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502358] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502364] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502369] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502375] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502381] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502386] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502392] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502397] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502403] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502408] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502413] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502419] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502424] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: 
[FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502429] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502449] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502455] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502461] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502466] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502473] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502478] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502484] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502489] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502495] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502501] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502506] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502513] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502519] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502525] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502530] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502536] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502541] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502547] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502553] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502558] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502564] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502569] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502575] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502580] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502586] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502591] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502597] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502602] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502607] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502613] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502618] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502624] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502630] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502636] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502641] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502647] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502652] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502658] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502663] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502668] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:19:26.570 [2024-12-15 05:09:46.502674] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502679] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502684] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502691] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502697] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502702] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502708] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502713] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 
05:09:46.502719] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502725] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502730] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502736] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502741] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502747] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502752] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502757] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502763] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502768] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502774] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502779] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502785] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502790] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502796] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502801] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502806] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502812] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502817] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502823] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502828] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502833] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502839] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502844] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502849] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 
00:19:26.571 [2024-12-15 05:09:46.502855] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502866] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502873] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502879] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502884] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:19:26.571 [2024-12-15 05:09:46.502896] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:19:26.571 [2024-12-15 05:09:46.502902] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 3af9b670-83ba-4af2-8311-25d469da50d8 00:19:26.571 [2024-12-15 05:09:46.502908] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:19:26.571 [2024-12-15 05:09:46.502913] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:19:26.571 [2024-12-15 05:09:46.502919] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:19:26.571 [2024-12-15 05:09:46.502924] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:19:26.571 [2024-12-15 05:09:46.502929] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:19:26.571 [2024-12-15 05:09:46.502940] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:19:26.571 [2024-12-15 05:09:46.502946] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:19:26.571 [2024-12-15 05:09:46.502951] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:19:26.571 [2024-12-15 05:09:46.502955] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:19:26.571 [2024-12-15 05:09:46.502960] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.571 [2024-12-15 05:09:46.502966] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:19:26.571 [2024-12-15 05:09:46.502972] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.660 ms 00:19:26.571 [2024-12-15 05:09:46.502977] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.504430] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.571 [2024-12-15 05:09:46.504528] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:19:26.571 [2024-12-15 05:09:46.504572] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.435 ms 00:19:26.571 [2024-12-15 05:09:46.504593] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.504671] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:26.571 [2024-12-15 05:09:46.504738] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:19:26.571 [2024-12-15 05:09:46.504757] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.050 ms 00:19:26.571 [2024-12-15 05:09:46.504771] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.509098] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.571 [2024-12-15 05:09:46.509197] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize 
reloc 00:19:26.571 [2024-12-15 05:09:46.509242] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.571 [2024-12-15 05:09:46.509264] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.509318] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.571 [2024-12-15 05:09:46.509360] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:26.571 [2024-12-15 05:09:46.509397] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.571 [2024-12-15 05:09:46.509414] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.509472] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.571 [2024-12-15 05:09:46.509571] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:26.571 [2024-12-15 05:09:46.509590] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.571 [2024-12-15 05:09:46.509609] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.509635] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.571 [2024-12-15 05:09:46.509679] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:26.571 [2024-12-15 05:09:46.509696] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.571 [2024-12-15 05:09:46.509710] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.517387] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.571 [2024-12-15 05:09:46.517514] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:26.571 [2024-12-15 05:09:46.517559] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.571 [2024-12-15 05:09:46.517581] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.523821] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.571 [2024-12-15 05:09:46.523933] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:26.571 [2024-12-15 05:09:46.523975] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.571 [2024-12-15 05:09:46.523992] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.524021] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.571 [2024-12-15 05:09:46.524115] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:26.571 [2024-12-15 05:09:46.524134] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.571 [2024-12-15 05:09:46.524149] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.524185] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.571 [2024-12-15 05:09:46.524266] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:26.571 [2024-12-15 05:09:46.524276] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.571 [2024-12-15 05:09:46.524282] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.524339] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.571 [2024-12-15 05:09:46.524347] 
mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:26.571 [2024-12-15 05:09:46.524353] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.571 [2024-12-15 05:09:46.524359] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.571 [2024-12-15 05:09:46.524382] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.571 [2024-12-15 05:09:46.524392] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:19:26.572 [2024-12-15 05:09:46.524398] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.572 [2024-12-15 05:09:46.524403] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.572 [2024-12-15 05:09:46.524556] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.572 [2024-12-15 05:09:46.524579] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:26.572 [2024-12-15 05:09:46.524594] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.572 [2024-12-15 05:09:46.524609] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.572 [2024-12-15 05:09:46.524657] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:26.572 [2024-12-15 05:09:46.524695] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:26.572 [2024-12-15 05:09:46.524710] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:26.572 [2024-12-15 05:09:46.524724] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:26.572 [2024-12-15 05:09:46.524838] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 42.012 ms, result 0 00:19:26.572 00:19:26.572 00:19:26.572 05:09:46 ftl.ftl_trim -- ftl/trim.sh@86 -- # cmp --bytes=4194304 /home/vagrant/spdk_repo/spdk/test/ftl/data /dev/zero 00:19:26.832 05:09:46 ftl.ftl_trim -- ftl/trim.sh@87 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/data 00:19:27.403 05:09:47 ftl.ftl_trim -- ftl/trim.sh@90 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/ftl/random_pattern --ob=ftl0 --count=1024 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:19:27.403 [2024-12-15 05:09:47.317355] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:19:27.403 [2024-12-15 05:09:47.317518] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid89957 ] 00:19:27.403 [2024-12-15 05:09:47.478573] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:27.403 [2024-12-15 05:09:47.506998] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:19:27.665 [2024-12-15 05:09:47.622202] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:19:27.665 [2024-12-15 05:09:47.622593] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:19:27.665 [2024-12-15 05:09:47.783172] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.665 [2024-12-15 05:09:47.783232] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:19:27.665 [2024-12-15 05:09:47.783248] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:19:27.665 [2024-12-15 05:09:47.783256] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.665 [2024-12-15 05:09:47.785835] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.665 [2024-12-15 05:09:47.786054] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:27.665 [2024-12-15 05:09:47.786075] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.558 ms 00:19:27.665 [2024-12-15 05:09:47.786091] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.665 [2024-12-15 05:09:47.786723] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:19:27.665 [2024-12-15 05:09:47.787155] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:19:27.665 [2024-12-15 05:09:47.787211] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.665 [2024-12-15 05:09:47.787221] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:27.665 [2024-12-15 05:09:47.787231] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.512 ms 00:19:27.665 [2024-12-15 05:09:47.787239] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.665 [2024-12-15 05:09:47.788990] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:19:27.665 [2024-12-15 05:09:47.792690] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.665 [2024-12-15 05:09:47.792880] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:19:27.665 [2024-12-15 05:09:47.792908] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.702 ms 00:19:27.665 [2024-12-15 05:09:47.792916] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.665 [2024-12-15 05:09:47.792995] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.665 [2024-12-15 05:09:47.793010] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:19:27.665 [2024-12-15 05:09:47.793019] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.028 ms 00:19:27.665 [2024-12-15 05:09:47.793027] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.665 [2024-12-15 05:09:47.801235] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
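The FTL startup trace that continues below follows a fixed shape from mngt/ftl_mngt.c: each management step emits an Action entry (Rollback on shutdown), then its name, its duration, and its status, with a finish_msg line summarizing the whole process at the end. A quick way to tabulate step durations from a captured log, as a sketch assuming one log entry per line and a hypothetical file name ftl_trim.log:

    awk '/trace_step.*name: /     { n = $0; sub(/.*name: /, "", n) }
         /trace_step.*duration: / { d = $0; sub(/.*duration: /, "", d)
                                    printf "%-32s %s\n", n, d }' ftl_trim.log

Each printed row pairs a step name with the duration reported on the trace_step line that follows it.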
00:19:27.665 [2024-12-15 05:09:47.801276] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:27.665 [2024-12-15 05:09:47.801288] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.161 ms 00:19:27.665 [2024-12-15 05:09:47.801304] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.665 [2024-12-15 05:09:47.801468] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.665 [2024-12-15 05:09:47.801481] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:27.665 [2024-12-15 05:09:47.801492] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.098 ms 00:19:27.665 [2024-12-15 05:09:47.801503] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.665 [2024-12-15 05:09:47.801532] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.665 [2024-12-15 05:09:47.801541] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:19:27.665 [2024-12-15 05:09:47.801550] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.008 ms 00:19:27.665 [2024-12-15 05:09:47.801558] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.665 [2024-12-15 05:09:47.801579] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on ftl_core_thread 00:19:27.927 [2024-12-15 05:09:47.803582] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.927 [2024-12-15 05:09:47.803730] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:27.927 [2024-12-15 05:09:47.803746] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.009 ms 00:19:27.927 [2024-12-15 05:09:47.803760] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.927 [2024-12-15 05:09:47.803810] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.927 [2024-12-15 05:09:47.803823] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:19:27.927 [2024-12-15 05:09:47.803831] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:19:27.927 [2024-12-15 05:09:47.803839] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.927 [2024-12-15 05:09:47.803858] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:19:27.927 [2024-12-15 05:09:47.803880] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:19:27.927 [2024-12-15 05:09:47.803916] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:19:27.927 [2024-12-15 05:09:47.803935] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:19:27.927 [2024-12-15 05:09:47.804042] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:19:27.927 [2024-12-15 05:09:47.804053] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:19:27.927 [2024-12-15 05:09:47.804064] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:19:27.927 [2024-12-15 05:09:47.804088] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:19:27.927 [2024-12-15 05:09:47.804105] ftl_layout.c: 
687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:19:27.927 [2024-12-15 05:09:47.804114] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 23592960 00:19:27.927 [2024-12-15 05:09:47.804121] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:19:27.927 [2024-12-15 05:09:47.804129] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:19:27.927 [2024-12-15 05:09:47.804140] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:19:27.927 [2024-12-15 05:09:47.804150] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.927 [2024-12-15 05:09:47.804158] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:19:27.927 [2024-12-15 05:09:47.804166] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.292 ms 00:19:27.927 [2024-12-15 05:09:47.804173] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.927 [2024-12-15 05:09:47.804262] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.927 [2024-12-15 05:09:47.804271] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:19:27.927 [2024-12-15 05:09:47.804279] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.070 ms 00:19:27.927 [2024-12-15 05:09:47.804287] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.927 [2024-12-15 05:09:47.804388] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:19:27.927 [2024-12-15 05:09:47.804407] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:19:27.927 [2024-12-15 05:09:47.804416] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:27.927 [2024-12-15 05:09:47.804426] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804466] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:19:27.927 [2024-12-15 05:09:47.804475] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804483] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 90.00 MiB 00:19:27.927 [2024-12-15 05:09:47.804493] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:19:27.927 [2024-12-15 05:09:47.804502] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804511] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:27.927 [2024-12-15 05:09:47.804519] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:19:27.927 [2024-12-15 05:09:47.804528] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.62 MiB 00:19:27.927 [2024-12-15 05:09:47.804536] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:27.927 [2024-12-15 05:09:47.804544] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:19:27.927 [2024-12-15 05:09:47.804552] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.88 MiB 00:19:27.927 [2024-12-15 05:09:47.804562] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804571] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:19:27.927 [2024-12-15 05:09:47.804579] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 124.00 MiB 00:19:27.927 [2024-12-15 05:09:47.804587] ftl_layout.c: 
133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804595] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:19:27.927 [2024-12-15 05:09:47.804603] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 91.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804611] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:27.927 [2024-12-15 05:09:47.804619] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:19:27.927 [2024-12-15 05:09:47.804633] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 99.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804641] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:27.927 [2024-12-15 05:09:47.804650] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:19:27.927 [2024-12-15 05:09:47.804658] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 107.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804666] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:27.927 [2024-12-15 05:09:47.804673] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:19:27.927 [2024-12-15 05:09:47.804681] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 115.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804689] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:27.927 [2024-12-15 05:09:47.804698] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:19:27.927 [2024-12-15 05:09:47.804706] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804714] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:27.927 [2024-12-15 05:09:47.804722] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:19:27.927 [2024-12-15 05:09:47.804736] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.38 MiB 00:19:27.927 [2024-12-15 05:09:47.804744] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:27.927 [2024-12-15 05:09:47.804750] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:19:27.927 [2024-12-15 05:09:47.804757] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.62 MiB 00:19:27.927 [2024-12-15 05:09:47.804766] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804772] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:19:27.927 [2024-12-15 05:09:47.804779] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.75 MiB 00:19:27.927 [2024-12-15 05:09:47.804785] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804792] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:19:27.927 [2024-12-15 05:09:47.804799] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:19:27.927 [2024-12-15 05:09:47.804807] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:27.927 [2024-12-15 05:09:47.804814] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:27.927 [2024-12-15 05:09:47.804824] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:19:27.927 [2024-12-15 05:09:47.804831] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:19:27.928 [2024-12-15 05:09:47.804838] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:19:27.928 
[2024-12-15 05:09:47.804846] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:19:27.928 [2024-12-15 05:09:47.804852] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:19:27.928 [2024-12-15 05:09:47.804860] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:19:27.928 [2024-12-15 05:09:47.804868] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:19:27.928 [2024-12-15 05:09:47.804878] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:27.928 [2024-12-15 05:09:47.804889] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5a00 00:19:27.928 [2024-12-15 05:09:47.804896] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5a20 blk_sz:0x80 00:19:27.928 [2024-12-15 05:09:47.804903] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x5aa0 blk_sz:0x80 00:19:27.928 [2024-12-15 05:09:47.804910] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5b20 blk_sz:0x800 00:19:27.928 [2024-12-15 05:09:47.804917] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x6320 blk_sz:0x800 00:19:27.928 [2024-12-15 05:09:47.804924] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6b20 blk_sz:0x800 00:19:27.928 [2024-12-15 05:09:47.804931] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x7320 blk_sz:0x800 00:19:27.928 [2024-12-15 05:09:47.804938] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7b20 blk_sz:0x40 00:19:27.928 [2024-12-15 05:09:47.804945] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7b60 blk_sz:0x40 00:19:27.928 [2024-12-15 05:09:47.804951] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x7ba0 blk_sz:0x20 00:19:27.928 [2024-12-15 05:09:47.804958] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x7bc0 blk_sz:0x20 00:19:27.928 [2024-12-15 05:09:47.804965] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x7be0 blk_sz:0x20 00:19:27.928 [2024-12-15 05:09:47.804972] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7c00 blk_sz:0x20 00:19:27.928 [2024-12-15 05:09:47.804980] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7c20 blk_sz:0x13b6e0 00:19:27.928 [2024-12-15 05:09:47.804986] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:19:27.928 [2024-12-15 05:09:47.804997] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:27.928 [2024-12-15 05:09:47.805007] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe 
ver:0 blk_offs:0x20 blk_sz:0x20 00:19:27.928 [2024-12-15 05:09:47.805014] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:19:27.928 [2024-12-15 05:09:47.805022] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:19:27.928 [2024-12-15 05:09:47.805029] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:19:27.928 [2024-12-15 05:09:47.805037] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.805045] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:19:27.928 [2024-12-15 05:09:47.805052] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.718 ms 00:19:27.928 [2024-12-15 05:09:47.805059] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.819345] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.819539] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:27.928 [2024-12-15 05:09:47.819558] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 14.232 ms 00:19:27.928 [2024-12-15 05:09:47.819567] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.819705] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.819722] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:19:27.928 [2024-12-15 05:09:47.819732] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.065 ms 00:19:27.928 [2024-12-15 05:09:47.819739] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.842692] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.842903] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:27.928 [2024-12-15 05:09:47.842928] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 22.928 ms 00:19:27.928 [2024-12-15 05:09:47.842948] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.843064] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.843079] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:27.928 [2024-12-15 05:09:47.843091] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:27.928 [2024-12-15 05:09:47.843100] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.843675] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.843719] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:27.928 [2024-12-15 05:09:47.843734] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.547 ms 00:19:27.928 [2024-12-15 05:09:47.843745] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.843927] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.843943] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:27.928 [2024-12-15 05:09:47.843975] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.147 ms 00:19:27.928 [2024-12-15 05:09:47.843985] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.852327] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.852373] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:27.928 [2024-12-15 05:09:47.852383] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.311 ms 00:19:27.928 [2024-12-15 05:09:47.852397] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.856257] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 1, empty chunks = 3 00:19:27.928 [2024-12-15 05:09:47.856307] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:19:27.928 [2024-12-15 05:09:47.856320] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.856329] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:19:27.928 [2024-12-15 05:09:47.856338] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.788 ms 00:19:27.928 [2024-12-15 05:09:47.856345] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.872172] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.872218] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:19:27.928 [2024-12-15 05:09:47.872240] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 15.762 ms 00:19:27.928 [2024-12-15 05:09:47.872248] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.874966] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.875013] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:19:27.928 [2024-12-15 05:09:47.875024] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.629 ms 00:19:27.928 [2024-12-15 05:09:47.875031] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.877616] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.877785] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:19:27.928 [2024-12-15 05:09:47.877803] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.494 ms 00:19:27.928 [2024-12-15 05:09:47.877811] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.878147] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.878159] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:19:27.928 [2024-12-15 05:09:47.878168] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.263 ms 00:19:27.928 [2024-12-15 05:09:47.878176] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.903415] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.903493] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:19:27.928 [2024-12-15 05:09:47.903507] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 
25.215 ms 00:19:27.928 [2024-12-15 05:09:47.903526] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.911728] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 59 (of 60) MiB 00:19:27.928 [2024-12-15 05:09:47.931014] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.931067] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:19:27.928 [2024-12-15 05:09:47.931080] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 27.387 ms 00:19:27.928 [2024-12-15 05:09:47.931088] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.931178] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.931192] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:19:27.928 [2024-12-15 05:09:47.931203] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.017 ms 00:19:27.928 [2024-12-15 05:09:47.931212] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.931270] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.931280] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:19:27.928 [2024-12-15 05:09:47.931294] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.036 ms 00:19:27.928 [2024-12-15 05:09:47.931302] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.931330] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.931339] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:19:27.928 [2024-12-15 05:09:47.931350] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.010 ms 00:19:27.928 [2024-12-15 05:09:47.931357] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.928 [2024-12-15 05:09:47.931397] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:19:27.928 [2024-12-15 05:09:47.931411] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.928 [2024-12-15 05:09:47.931420] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:19:27.928 [2024-12-15 05:09:47.931428] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.016 ms 00:19:27.928 [2024-12-15 05:09:47.931473] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.929 [2024-12-15 05:09:47.937577] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.929 [2024-12-15 05:09:47.937627] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:19:27.929 [2024-12-15 05:09:47.937648] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.082 ms 00:19:27.929 [2024-12-15 05:09:47.937659] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.929 [2024-12-15 05:09:47.937753] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:27.929 [2024-12-15 05:09:47.937764] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:19:27.929 [2024-12-15 05:09:47.937773] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.038 ms 00:19:27.929 [2024-12-15 05:09:47.937781] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:27.929 
[2024-12-15 05:09:47.938772] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread
00:19:27.929 [2024-12-15 05:09:47.940132] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 155.279 ms, result 0
00:19:27.929 [2024-12-15 05:09:47.941552] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread
00:19:27.929 [2024-12-15 05:09:47.948800] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread
00:19:28.190 [2024-12-15T05:09:48.330Z] Copying: 4096/4096 [kB] (average 10 MBps)
[2024-12-15 05:09:48.321477] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread
00:19:28.190 [2024-12-15 05:09:48.322523] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:19:28.190 [2024-12-15 05:09:48.322565] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel
00:19:28.190 [2024-12-15 05:09:48.322577] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms
00:19:28.190 [2024-12-15 05:09:48.322585] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:19:28.190 [2024-12-15 05:09:48.322605] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on ftl_core_thread
00:19:28.190 [2024-12-15 05:09:48.323244] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:19:28.190 [2024-12-15 05:09:48.323264] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device
00:19:28.190 [2024-12-15 05:09:48.323273] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.627 ms
00:19:28.190 [2024-12-15 05:09:48.323281] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:19:28.190 [2024-12-15 05:09:48.326242] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:19:28.190 [2024-12-15 05:09:48.326292] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller
00:19:28.190 [2024-12-15 05:09:48.326303] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.938 ms
00:19:28.190 [2024-12-15 05:09:48.326311] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:19:28.452 [2024-12-15 05:09:48.330692] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:19:28.452 [2024-12-15 05:09:48.330735] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P
00:19:28.452 [2024-12-15 05:09:48.330747] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.364 ms
00:19:28.452 [2024-12-15 05:09:48.330756] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:19:28.452 [2024-12-15 05:09:48.337660] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:19:28.452 [2024-12-15 05:09:48.337855] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims
00:19:28.452 [2024-12-15 05:09:48.337881] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.871 ms
00:19:28.452 [2024-12-15 05:09:48.337889] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:19:28.452 [2024-12-15 05:09:48.340796] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:19:28.452 [2024-12-15 05:09:48.340843] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata
00:19:28.452 [2024-12-15 05:09:48.340853] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0]
duration: 2.846 ms 00:19:28.452 [2024-12-15 05:09:48.340860] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.452 [2024-12-15 05:09:48.346193] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:28.452 [2024-12-15 05:09:48.346241] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:19:28.452 [2024-12-15 05:09:48.346251] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.289 ms 00:19:28.452 [2024-12-15 05:09:48.346260] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.452 [2024-12-15 05:09:48.346395] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:28.452 [2024-12-15 05:09:48.346409] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:19:28.452 [2024-12-15 05:09:48.346417] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.092 ms 00:19:28.452 [2024-12-15 05:09:48.346424] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.452 [2024-12-15 05:09:48.349882] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:28.452 [2024-12-15 05:09:48.349930] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:19:28.453 [2024-12-15 05:09:48.349939] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.418 ms 00:19:28.453 [2024-12-15 05:09:48.349947] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.453 [2024-12-15 05:09:48.352620] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:28.453 [2024-12-15 05:09:48.352794] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:19:28.453 [2024-12-15 05:09:48.352811] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.630 ms 00:19:28.453 [2024-12-15 05:09:48.352817] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.453 [2024-12-15 05:09:48.354971] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:28.453 [2024-12-15 05:09:48.355019] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:19:28.453 [2024-12-15 05:09:48.355029] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.114 ms 00:19:28.453 [2024-12-15 05:09:48.355037] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.453 [2024-12-15 05:09:48.356925] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:28.453 [2024-12-15 05:09:48.356972] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:19:28.453 [2024-12-15 05:09:48.356981] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.818 ms 00:19:28.453 [2024-12-15 05:09:48.356987] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.453 [2024-12-15 05:09:48.357028] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:19:28.453 [2024-12-15 05:09:48.357042] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357061] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357069] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357077] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 
05:09:48.357085] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357092] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357100] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357107] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357115] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357122] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357129] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357136] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357143] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357150] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357157] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357164] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357172] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357180] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357187] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357194] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357201] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357208] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357215] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357222] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357229] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357236] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357245] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357252] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357259] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 
00:19:28.453 [2024-12-15 05:09:48.357266] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357274] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357282] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357291] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357299] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357307] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357315] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357322] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357330] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357338] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357345] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357352] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357360] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357367] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357374] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357381] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357389] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357397] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357404] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357411] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357418] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357425] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357454] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357463] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357471] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 
wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357479] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357487] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357495] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357503] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357510] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357518] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357525] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357532] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357540] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357547] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357555] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357563] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357571] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357578] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357586] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357593] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357601] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357608] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357615] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357623] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357631] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357638] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357645] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357653] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:19:28.453 [2024-12-15 05:09:48.357660] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] 
Band 79: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357667] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357675] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357682] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357690] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357697] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357707] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357715] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357722] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357729] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357737] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357744] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357752] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357759] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357766] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357774] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357783] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357790] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357817] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357824] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357832] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357840] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:19:28.454 [2024-12-15 05:09:48.357856] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:19:28.454 [2024-12-15 05:09:48.357864] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 3af9b670-83ba-4af2-8311-25d469da50d8 00:19:28.454 [2024-12-15 05:09:48.357872] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:19:28.454 [2024-12-15 05:09:48.357880] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:19:28.454 
[2024-12-15 05:09:48.357887] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:19:28.454 [2024-12-15 05:09:48.357895] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:19:28.454 [2024-12-15 05:09:48.357905] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:19:28.454 [2024-12-15 05:09:48.357912] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:19:28.454 [2024-12-15 05:09:48.357919] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:19:28.454 [2024-12-15 05:09:48.357926] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:19:28.454 [2024-12-15 05:09:48.357934] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:19:28.454 [2024-12-15 05:09:48.357941] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:28.454 [2024-12-15 05:09:48.357949] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:19:28.454 [2024-12-15 05:09:48.357957] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.915 ms 00:19:28.454 [2024-12-15 05:09:48.357965] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.359998] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:28.454 [2024-12-15 05:09:48.360185] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:19:28.454 [2024-12-15 05:09:48.360211] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.015 ms 00:19:28.454 [2024-12-15 05:09:48.360219] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.360356] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:28.454 [2024-12-15 05:09:48.360366] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:19:28.454 [2024-12-15 05:09:48.360374] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.097 ms 00:19:28.454 [2024-12-15 05:09:48.360383] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.367861] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.367913] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:28.454 [2024-12-15 05:09:48.367923] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.367931] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.367994] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.368002] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:28.454 [2024-12-15 05:09:48.368011] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.368020] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.368066] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.368088] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:28.454 [2024-12-15 05:09:48.368096] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.368108] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.368126] mngt/ftl_mngt.c: 427:trace_step: 
*NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.368134] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:28.454 [2024-12-15 05:09:48.368141] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.368148] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.381808] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.381863] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:28.454 [2024-12-15 05:09:48.381883] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.381896] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.392917] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.392976] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:28.454 [2024-12-15 05:09:48.392988] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.392997] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.393053] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.393063] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:28.454 [2024-12-15 05:09:48.393072] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.393080] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.393120] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.393136] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:28.454 [2024-12-15 05:09:48.393144] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.393152] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.393232] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.393242] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:28.454 [2024-12-15 05:09:48.393252] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.393259] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.393294] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.393308] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:19:28.454 [2024-12-15 05:09:48.393317] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.393326] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:28.454 [2024-12-15 05:09:48.393371] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:28.454 [2024-12-15 05:09:48.393382] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:28.454 [2024-12-15 05:09:48.393390] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:28.454 [2024-12-15 05:09:48.393398] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 
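The entries just below show the restart half of the test: trim.sh backgrounds a fresh spdk_tgt (with the ftl_init debug log flag), waitforlisten from common/autotest_common.sh polls until the target's RPC server is listening on /var/tmp/spdk.sock, and rpc.py load_config then replays the saved JSON configuration so the FTL device is brought back up on top of the same cache and base bdevs. A condensed sketch of that pattern (waitforlisten is simplified to a poll loop here, and the config redirection is not visible in the xtrace, so both are assumptions):

    /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -L ftl_init &
    svcpid=$!

    # Simplified stand-in for waitforlisten: poll the RPC socket.
    retries=100
    until scripts/rpc.py -t 1 rpc_get_methods >/dev/null 2>&1; do
        (( retries-- > 0 )) || { echo 'spdk_tgt did not start' >&2; exit 1; }
        sleep 0.5
    done

    # Replay the bdev/FTL configuration saved before the shutdown.
    scripts/rpc.py load_config < test/ftl/config/ftl.json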
00:19:28.454 [2024-12-15 05:09:48.393477] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:19:28.454 [2024-12-15 05:09:48.393489] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev
00:19:28.454 [2024-12-15 05:09:48.393497] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:19:28.454 [2024-12-15 05:09:48.393505] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:19:28.454 [2024-12-15 05:09:48.393661] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 71.110 ms, result 0
00:19:28.716
00:19:28.716
00:19:28.716 05:09:48 ftl.ftl_trim -- ftl/trim.sh@93 -- # svcpid=89980
00:19:28.716 05:09:48 ftl.ftl_trim -- ftl/trim.sh@94 -- # waitforlisten 89980
00:19:28.716 05:09:48 ftl.ftl_trim -- ftl/trim.sh@92 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -L ftl_init
00:19:28.716 05:09:48 ftl.ftl_trim -- common/autotest_common.sh@835 -- # '[' -z 89980 ']'
00:19:28.716 05:09:48 ftl.ftl_trim -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:19:28.716 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:19:28.716 05:09:48 ftl.ftl_trim -- common/autotest_common.sh@840 -- # local max_retries=100
00:19:28.716 05:09:48 ftl.ftl_trim -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:19:28.716 05:09:48 ftl.ftl_trim -- common/autotest_common.sh@844 -- # xtrace_disable
00:19:28.716 05:09:48 ftl.ftl_trim -- common/autotest_common.sh@10 -- # set +x
00:19:28.716 [2024-12-15 05:09:48.695098] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization...
00:19:28.977 [2024-12-15 05:09:48.695247] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid89980 ]
00:19:28.977 [2024-12-15 05:09:48.858997] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:19:28.977 [2024-12-15 05:09:48.889231] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0
00:19:29.549 05:09:49 ftl.ftl_trim -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:19:29.549 05:09:49 ftl.ftl_trim -- common/autotest_common.sh@868 -- # return 0
00:19:29.549 05:09:49 ftl.ftl_trim -- ftl/trim.sh@96 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py load_config
00:19:29.810 [2024-12-15 05:09:49.787899] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1
00:19:29.810 [2024-12-15 05:09:49.787991] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1
00:19:30.072 [2024-12-15 05:09:49.965148] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:19:30.072 [2024-12-15 05:09:49.965212] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration
00:19:30.072 [2024-12-15 05:09:49.965228] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms
00:19:30.072 [2024-12-15 05:09:49.965238] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:19:30.072 [2024-12-15 05:09:49.967868] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:19:30.072 [2024-12-15 05:09:49.968059] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev
00:19:30.072 [2024-12-15 05:09:49.968104] mngt/ftl_mngt.c:
430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.610 ms 00:19:30.072 [2024-12-15 05:09:49.968120] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.968230] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:19:30.072 [2024-12-15 05:09:49.968520] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:19:30.072 [2024-12-15 05:09:49.968540] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.072 [2024-12-15 05:09:49.968551] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:30.072 [2024-12-15 05:09:49.968561] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.320 ms 00:19:30.072 [2024-12-15 05:09:49.968572] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.970262] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:19:30.072 [2024-12-15 05:09:49.974121] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.072 [2024-12-15 05:09:49.974169] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:19:30.072 [2024-12-15 05:09:49.974183] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.855 ms 00:19:30.072 [2024-12-15 05:09:49.974191] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.974271] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.072 [2024-12-15 05:09:49.974282] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:19:30.072 [2024-12-15 05:09:49.974296] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.026 ms 00:19:30.072 [2024-12-15 05:09:49.974304] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.982509] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.072 [2024-12-15 05:09:49.982549] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:30.072 [2024-12-15 05:09:49.982562] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.143 ms 00:19:30.072 [2024-12-15 05:09:49.982569] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.982684] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.072 [2024-12-15 05:09:49.982694] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:30.072 [2024-12-15 05:09:49.982711] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.057 ms 00:19:30.072 [2024-12-15 05:09:49.982723] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.982753] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.072 [2024-12-15 05:09:49.982764] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:19:30.072 [2024-12-15 05:09:49.982774] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.010 ms 00:19:30.072 [2024-12-15 05:09:49.982781] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.982809] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on ftl_core_thread 00:19:30.072 [2024-12-15 05:09:49.984996] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
00:19:30.072 [2024-12-15 05:09:49.985038] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:30.072 [2024-12-15 05:09:49.985052] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.194 ms 00:19:30.072 [2024-12-15 05:09:49.985061] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.985107] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.072 [2024-12-15 05:09:49.985121] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:19:30.072 [2024-12-15 05:09:49.985132] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:19:30.072 [2024-12-15 05:09:49.985142] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.985164] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:19:30.072 [2024-12-15 05:09:49.985188] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:19:30.072 [2024-12-15 05:09:49.985229] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:19:30.072 [2024-12-15 05:09:49.985251] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:19:30.072 [2024-12-15 05:09:49.985357] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:19:30.072 [2024-12-15 05:09:49.985370] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:19:30.072 [2024-12-15 05:09:49.985381] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:19:30.072 [2024-12-15 05:09:49.985393] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:19:30.072 [2024-12-15 05:09:49.985405] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:19:30.072 [2024-12-15 05:09:49.985418] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 23592960 00:19:30.072 [2024-12-15 05:09:49.985429] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:19:30.072 [2024-12-15 05:09:49.985461] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:19:30.072 [2024-12-15 05:09:49.985470] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:19:30.072 [2024-12-15 05:09:49.985483] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.072 [2024-12-15 05:09:49.985491] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:19:30.072 [2024-12-15 05:09:49.985501] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.320 ms 00:19:30.072 [2024-12-15 05:09:49.985508] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.985614] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.072 [2024-12-15 05:09:49.985624] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:19:30.072 [2024-12-15 05:09:49.985635] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.085 ms 00:19:30.072 [2024-12-15 05:09:49.985643] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.072 [2024-12-15 05:09:49.985756] 
ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:19:30.072 [2024-12-15 05:09:49.985768] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:19:30.072 [2024-12-15 05:09:49.985780] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:30.072 [2024-12-15 05:09:49.985792] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:30.072 [2024-12-15 05:09:49.985807] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:19:30.072 [2024-12-15 05:09:49.985821] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:19:30.072 [2024-12-15 05:09:49.985837] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 90.00 MiB 00:19:30.072 [2024-12-15 05:09:49.985846] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:19:30.072 [2024-12-15 05:09:49.985855] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.12 MiB 00:19:30.072 [2024-12-15 05:09:49.985863] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:30.073 [2024-12-15 05:09:49.985873] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:19:30.073 [2024-12-15 05:09:49.985881] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.62 MiB 00:19:30.073 [2024-12-15 05:09:49.985891] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:30.073 [2024-12-15 05:09:49.985899] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:19:30.073 [2024-12-15 05:09:49.985908] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.88 MiB 00:19:30.073 [2024-12-15 05:09:49.985916] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:30.073 [2024-12-15 05:09:49.985930] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:19:30.073 [2024-12-15 05:09:49.985939] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 124.00 MiB 00:19:30.073 [2024-12-15 05:09:49.985949] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:30.073 [2024-12-15 05:09:49.985957] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:19:30.073 [2024-12-15 05:09:49.985970] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 91.12 MiB 00:19:30.073 [2024-12-15 05:09:49.985978] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:30.073 [2024-12-15 05:09:49.985988] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:19:30.073 [2024-12-15 05:09:49.985996] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 99.12 MiB 00:19:30.073 [2024-12-15 05:09:49.986005] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:30.073 [2024-12-15 05:09:49.986014] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:19:30.073 [2024-12-15 05:09:49.986023] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 107.12 MiB 00:19:30.073 [2024-12-15 05:09:49.986029] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:30.073 [2024-12-15 05:09:49.986038] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:19:30.073 [2024-12-15 05:09:49.986044] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 115.12 MiB 00:19:30.073 [2024-12-15 05:09:49.986052] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:30.073 [2024-12-15 05:09:49.986059] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:19:30.073 [2024-12-15 
05:09:49.986067] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.12 MiB 00:19:30.073 [2024-12-15 05:09:49.986073] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:30.073 [2024-12-15 05:09:49.986081] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:19:30.073 [2024-12-15 05:09:49.986088] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.38 MiB 00:19:30.073 [2024-12-15 05:09:49.986099] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:30.073 [2024-12-15 05:09:49.986105] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:19:30.073 [2024-12-15 05:09:49.986113] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.62 MiB 00:19:30.073 [2024-12-15 05:09:49.986119] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:30.073 [2024-12-15 05:09:49.986128] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:19:30.073 [2024-12-15 05:09:49.986135] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.75 MiB 00:19:30.073 [2024-12-15 05:09:49.986143] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:30.073 [2024-12-15 05:09:49.986150] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:19:30.073 [2024-12-15 05:09:49.986159] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:19:30.073 [2024-12-15 05:09:49.986167] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:30.073 [2024-12-15 05:09:49.986180] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:30.073 [2024-12-15 05:09:49.986188] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:19:30.073 [2024-12-15 05:09:49.986197] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:19:30.073 [2024-12-15 05:09:49.986206] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:19:30.073 [2024-12-15 05:09:49.986216] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:19:30.073 [2024-12-15 05:09:49.986223] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:19:30.073 [2024-12-15 05:09:49.986233] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:19:30.073 [2024-12-15 05:09:49.986242] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:19:30.073 [2024-12-15 05:09:49.986254] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:30.073 [2024-12-15 05:09:49.986265] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5a00 00:19:30.073 [2024-12-15 05:09:49.986275] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5a20 blk_sz:0x80 00:19:30.073 [2024-12-15 05:09:49.986282] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x5aa0 blk_sz:0x80 00:19:30.073 [2024-12-15 05:09:49.986292] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5b20 blk_sz:0x800 00:19:30.073 [2024-12-15 05:09:49.986300] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x6320 blk_sz:0x800 00:19:30.073 
[2024-12-15 05:09:49.986309] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6b20 blk_sz:0x800 00:19:30.073 [2024-12-15 05:09:49.986316] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x7320 blk_sz:0x800 00:19:30.073 [2024-12-15 05:09:49.986325] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7b20 blk_sz:0x40 00:19:30.073 [2024-12-15 05:09:49.986333] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7b60 blk_sz:0x40 00:19:30.073 [2024-12-15 05:09:49.986341] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x7ba0 blk_sz:0x20 00:19:30.073 [2024-12-15 05:09:49.986348] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x7bc0 blk_sz:0x20 00:19:30.073 [2024-12-15 05:09:49.986358] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x7be0 blk_sz:0x20 00:19:30.073 [2024-12-15 05:09:49.986364] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7c00 blk_sz:0x20 00:19:30.073 [2024-12-15 05:09:49.986375] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7c20 blk_sz:0x13b6e0 00:19:30.073 [2024-12-15 05:09:49.986382] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:19:30.073 [2024-12-15 05:09:49.986393] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:30.073 [2024-12-15 05:09:49.986401] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:19:30.073 [2024-12-15 05:09:49.986411] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:19:30.073 [2024-12-15 05:09:49.986418] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:19:30.073 [2024-12-15 05:09:49.986427] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:19:30.073 [2024-12-15 05:09:49.986477] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.073 [2024-12-15 05:09:49.986489] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:19:30.073 [2024-12-15 05:09:49.986501] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.791 ms 00:19:30.073 [2024-12-15 05:09:49.986514] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.073 [2024-12-15 05:09:50.001185] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.073 [2024-12-15 05:09:50.001393] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:30.073 [2024-12-15 05:09:50.001413] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 14.610 ms 00:19:30.073 [2024-12-15 05:09:50.001424] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.073 [2024-12-15 05:09:50.001594] mngt/ftl_mngt.c: 
427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.073 [2024-12-15 05:09:50.001612] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:19:30.073 [2024-12-15 05:09:50.001622] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:19:30.073 [2024-12-15 05:09:50.001632] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.073 [2024-12-15 05:09:50.014624] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.073 [2024-12-15 05:09:50.014673] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:30.073 [2024-12-15 05:09:50.014684] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 12.965 ms 00:19:30.073 [2024-12-15 05:09:50.014697] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.073 [2024-12-15 05:09:50.014764] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.073 [2024-12-15 05:09:50.014776] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:30.073 [2024-12-15 05:09:50.014785] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:30.073 [2024-12-15 05:09:50.014795] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.073 [2024-12-15 05:09:50.015336] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.073 [2024-12-15 05:09:50.015359] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:30.073 [2024-12-15 05:09:50.015372] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.519 ms 00:19:30.073 [2024-12-15 05:09:50.015383] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.073 [2024-12-15 05:09:50.015559] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.073 [2024-12-15 05:09:50.015629] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:30.073 [2024-12-15 05:09:50.015642] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.145 ms 00:19:30.073 [2024-12-15 05:09:50.015653] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.073 [2024-12-15 05:09:50.024256] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.073 [2024-12-15 05:09:50.024306] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:30.073 [2024-12-15 05:09:50.024317] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.578 ms 00:19:30.073 [2024-12-15 05:09:50.024326] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.073 [2024-12-15 05:09:50.041567] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 2, empty chunks = 2 00:19:30.073 [2024-12-15 05:09:50.041677] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:19:30.073 [2024-12-15 05:09:50.041712] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.041739] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:19:30.074 [2024-12-15 05:09:50.041766] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 17.287 ms 00:19:30.074 [2024-12-15 05:09:50.041790] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.060199] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 
05:09:50.060253] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:19:30.074 [2024-12-15 05:09:50.060265] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 18.297 ms 00:19:30.074 [2024-12-15 05:09:50.060277] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.063207] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.063259] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:19:30.074 [2024-12-15 05:09:50.063269] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.816 ms 00:19:30.074 [2024-12-15 05:09:50.063279] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.065840] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.065888] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:19:30.074 [2024-12-15 05:09:50.065897] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.512 ms 00:19:30.074 [2024-12-15 05:09:50.065906] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.066260] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.066274] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:19:30.074 [2024-12-15 05:09:50.066284] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.273 ms 00:19:30.074 [2024-12-15 05:09:50.066293] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.090738] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.090807] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:19:30.074 [2024-12-15 05:09:50.090820] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 24.421 ms 00:19:30.074 [2024-12-15 05:09:50.090839] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.099098] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 59 (of 60) MiB 00:19:30.074 [2024-12-15 05:09:50.118146] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.118196] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:19:30.074 [2024-12-15 05:09:50.118211] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 27.205 ms 00:19:30.074 [2024-12-15 05:09:50.118225] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.118315] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.118329] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:19:30.074 [2024-12-15 05:09:50.118341] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:19:30.074 [2024-12-15 05:09:50.118350] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.118411] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.118420] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:19:30.074 [2024-12-15 05:09:50.118450] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.038 ms 00:19:30.074 [2024-12-15 
05:09:50.118459] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.118487] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.118496] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:19:30.074 [2024-12-15 05:09:50.118515] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:19:30.074 [2024-12-15 05:09:50.118523] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.118562] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:19:30.074 [2024-12-15 05:09:50.118572] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.118582] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:19:30.074 [2024-12-15 05:09:50.118590] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:19:30.074 [2024-12-15 05:09:50.118600] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.124469] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.124522] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:19:30.074 [2024-12-15 05:09:50.124534] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.844 ms 00:19:30.074 [2024-12-15 05:09:50.124547] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.124637] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.074 [2024-12-15 05:09:50.124650] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:19:30.074 [2024-12-15 05:09:50.124660] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.039 ms 00:19:30.074 [2024-12-15 05:09:50.124670] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.074 [2024-12-15 05:09:50.125725] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:19:30.074 [2024-12-15 05:09:50.127039] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 160.250 ms, result 0 00:19:30.074 [2024-12-15 05:09:50.129111] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:19:30.074 Some configs were skipped because the RPC state that can call them passed over. 
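With FTL startup finished, the test body that follows (trim.sh@99 through trim.sh@105) unmaps 1024 blocks at each end of the device, shuts the target down, and reads the data back. A minimal standalone sketch of that flow, assuming a built SPDK tree at /home/vagrant/spdk_repo/spdk and an ftl.json saved by a prior run; the waitforlisten/killprocess helpers from autotest_common.sh are approximated with sleep/kill, and the stdin plumbing into load_config is an assumption (the xtrace above does not show the redirection):

SPDK=/home/vagrant/spdk_repo/spdk
# Start the target with FTL init tracing, as trim.sh@92 does above.
"$SPDK/build/bin/spdk_tgt" -L ftl_init &
svcpid=$!
sleep 2                                    # stand-in for waitforlisten on /var/tmp/spdk.sock
# Recreate the FTL bdev from the saved JSON config (assumed redirection).
"$SPDK/scripts/rpc.py" load_config < "$SPDK/test/ftl/config/ftl.json"
# Trim 1024 blocks at the bottom and at the top of the address space.
"$SPDK/scripts/rpc.py" bdev_ftl_unmap -b ftl0 --lba 0 --num_blocks 1024
"$SPDK/scripts/rpc.py" bdev_ftl_unmap -b ftl0 --lba 23591936 --num_blocks 1024
# Shut the target down cleanly so the trim map is persisted ('FTL shutdown' below).
kill "$svcpid"; wait "$svcpid" || true     # killprocess also verifies the pid first
# Read 65536 blocks back through a standalone spdk_dd instance, as trim.sh@105 does below.
"$SPDK/build/bin/spdk_dd" --ib=ftl0 --of="$SPDK/test/ftl/data" --count=65536 \
        --json="$SPDK/test/ftl/config/ftl.json"

The second unmap targets LBA 23591936 because the layout dump above reports 23592960 L2P entries, so the first and the last 1024-block stripes both get trimmed; the 'Persist trim metadata' step in the shutdown below mirrors the 'Restore trim metadata' step seen during startup above, which is how the trim map survives the restart into spdk_dd.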
00:19:30.074 05:09:50 ftl.ftl_trim -- ftl/trim.sh@99 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_unmap -b ftl0 --lba 0 --num_blocks 1024 00:19:30.335 [2024-12-15 05:09:50.367058] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.335 [2024-12-15 05:09:50.367231] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Process trim 00:19:30.335 [2024-12-15 05:09:50.367299] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.166 ms 00:19:30.335 [2024-12-15 05:09:50.367325] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.335 [2024-12-15 05:09:50.367383] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL trim', duration = 3.499 ms, result 0 00:19:30.335 true 00:19:30.335 05:09:50 ftl.ftl_trim -- ftl/trim.sh@100 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_unmap -b ftl0 --lba 23591936 --num_blocks 1024 00:19:30.596 [2024-12-15 05:09:50.582944] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.596 [2024-12-15 05:09:50.583130] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Process trim 00:19:30.596 [2024-12-15 05:09:50.583203] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.788 ms 00:19:30.596 [2024-12-15 05:09:50.583229] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.596 [2024-12-15 05:09:50.583286] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL trim', duration = 3.129 ms, result 0 00:19:30.596 true 00:19:30.596 05:09:50 ftl.ftl_trim -- ftl/trim.sh@102 -- # killprocess 89980 00:19:30.596 05:09:50 ftl.ftl_trim -- common/autotest_common.sh@954 -- # '[' -z 89980 ']' 00:19:30.596 05:09:50 ftl.ftl_trim -- common/autotest_common.sh@958 -- # kill -0 89980 00:19:30.596 05:09:50 ftl.ftl_trim -- common/autotest_common.sh@959 -- # uname 00:19:30.596 05:09:50 ftl.ftl_trim -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:19:30.596 05:09:50 ftl.ftl_trim -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 89980 00:19:30.596 killing process with pid 89980 00:19:30.596 05:09:50 ftl.ftl_trim -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:19:30.596 05:09:50 ftl.ftl_trim -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:19:30.596 05:09:50 ftl.ftl_trim -- common/autotest_common.sh@972 -- # echo 'killing process with pid 89980' 00:19:30.596 05:09:50 ftl.ftl_trim -- common/autotest_common.sh@973 -- # kill 89980 00:19:30.596 05:09:50 ftl.ftl_trim -- common/autotest_common.sh@978 -- # wait 89980 00:19:30.858 [2024-12-15 05:09:50.773874] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.773946] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:19:30.858 [2024-12-15 05:09:50.773964] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:30.858 [2024-12-15 05:09:50.773979] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.774008] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on ftl_core_thread 00:19:30.858 [2024-12-15 05:09:50.774775] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.774822] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:19:30.858 [2024-12-15 05:09:50.774837] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: 
[FTL][ftl0] duration: 0.751 ms 00:19:30.858 [2024-12-15 05:09:50.774848] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.775145] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.775165] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:19:30.858 [2024-12-15 05:09:50.775174] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.266 ms 00:19:30.858 [2024-12-15 05:09:50.775184] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.779730] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.779777] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:19:30.858 [2024-12-15 05:09:50.779788] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.528 ms 00:19:30.858 [2024-12-15 05:09:50.779804] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.786758] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.786803] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:19:30.858 [2024-12-15 05:09:50.786813] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.912 ms 00:19:30.858 [2024-12-15 05:09:50.786825] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.789865] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.789919] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:19:30.858 [2024-12-15 05:09:50.789930] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.971 ms 00:19:30.858 [2024-12-15 05:09:50.789939] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.795638] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.795691] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:19:30.858 [2024-12-15 05:09:50.795704] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.651 ms 00:19:30.858 [2024-12-15 05:09:50.795713] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.795857] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.795870] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:19:30.858 [2024-12-15 05:09:50.795879] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.095 ms 00:19:30.858 [2024-12-15 05:09:50.795889] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.799356] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.799408] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:19:30.858 [2024-12-15 05:09:50.799418] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.448 ms 00:19:30.858 [2024-12-15 05:09:50.799454] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.802224] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.802275] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:19:30.858 [2024-12-15 
05:09:50.802286] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.697 ms 00:19:30.858 [2024-12-15 05:09:50.802295] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.804476] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.804525] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:19:30.858 [2024-12-15 05:09:50.804534] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.139 ms 00:19:30.858 [2024-12-15 05:09:50.804543] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.806781] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.858 [2024-12-15 05:09:50.806833] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:19:30.858 [2024-12-15 05:09:50.806842] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.163 ms 00:19:30.858 [2024-12-15 05:09:50.806851] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.858 [2024-12-15 05:09:50.806892] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:19:30.858 [2024-12-15 05:09:50.806909] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.806919] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.806933] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.806940] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.806950] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.806958] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.806968] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.806975] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.806985] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.806992] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807001] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807009] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807020] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807027] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807036] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807044] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807053] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807061] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807073] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807080] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807090] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807097] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807106] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807114] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807123] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807130] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807141] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807148] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807158] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807166] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807175] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807183] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807194] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807202] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807214] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807221] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807231] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807238] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807249] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:19:30.858 [2024-12-15 05:09:50.807257] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807266] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 
05:09:50.807274] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807283] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807290] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807300] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807307] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807316] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807324] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807333] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807340] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807351] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807358] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807367] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807375] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807384] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807393] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807402] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807409] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807418] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807425] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807459] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807468] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807477] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807485] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807497] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807505] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 
00:19:30.859 [2024-12-15 05:09:50.807519] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807526] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807536] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807544] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807553] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807561] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807571] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807578] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807589] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807596] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807606] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807613] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807623] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807630] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807640] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807654] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807665] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807673] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807683] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807690] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807699] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807706] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807716] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807723] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807733] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 
wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807741] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807750] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807757] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807766] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807774] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807784] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807792] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807804] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807811] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:19:30.859 [2024-12-15 05:09:50.807829] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:19:30.859 [2024-12-15 05:09:50.807838] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 3af9b670-83ba-4af2-8311-25d469da50d8 00:19:30.859 [2024-12-15 05:09:50.807850] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:19:30.859 [2024-12-15 05:09:50.807858] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:19:30.859 [2024-12-15 05:09:50.807867] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:19:30.859 [2024-12-15 05:09:50.807876] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:19:30.859 [2024-12-15 05:09:50.807885] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:19:30.859 [2024-12-15 05:09:50.807896] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:19:30.859 [2024-12-15 05:09:50.807905] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:19:30.859 [2024-12-15 05:09:50.807912] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:19:30.859 [2024-12-15 05:09:50.807920] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:19:30.859 [2024-12-15 05:09:50.807927] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.859 [2024-12-15 05:09:50.807936] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:19:30.859 [2024-12-15 05:09:50.807953] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.036 ms 00:19:30.859 [2024-12-15 05:09:50.807964] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.859 [2024-12-15 05:09:50.810500] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:30.859 [2024-12-15 05:09:50.810636] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:19:30.859 [2024-12-15 05:09:50.810699] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.500 ms 00:19:30.859 [2024-12-15 05:09:50.810728] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.859 [2024-12-15 05:09:50.810888] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: 
[FTL][ftl0] Action 00:19:30.859 [2024-12-15 05:09:50.811005] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:19:30.859 [2024-12-15 05:09:50.811184] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.101 ms 00:19:30.859 [2024-12-15 05:09:50.811229] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.859 [2024-12-15 05:09:50.819065] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.859 [2024-12-15 05:09:50.819236] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:30.859 [2024-12-15 05:09:50.819292] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.859 [2024-12-15 05:09:50.819317] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.859 [2024-12-15 05:09:50.819426] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.859 [2024-12-15 05:09:50.819497] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:30.859 [2024-12-15 05:09:50.819518] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.859 [2024-12-15 05:09:50.819548] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.859 [2024-12-15 05:09:50.819683] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.859 [2024-12-15 05:09:50.819714] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:30.859 [2024-12-15 05:09:50.819736] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.859 [2024-12-15 05:09:50.819757] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.859 [2024-12-15 05:09:50.819789] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.859 [2024-12-15 05:09:50.819852] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:30.859 [2024-12-15 05:09:50.819877] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.860 [2024-12-15 05:09:50.819899] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.860 [2024-12-15 05:09:50.834118] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.860 [2024-12-15 05:09:50.834315] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:30.860 [2024-12-15 05:09:50.834371] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.860 [2024-12-15 05:09:50.834396] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.860 [2024-12-15 05:09:50.845333] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.860 [2024-12-15 05:09:50.845537] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:30.860 [2024-12-15 05:09:50.845597] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.860 [2024-12-15 05:09:50.845626] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.860 [2024-12-15 05:09:50.845713] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.860 [2024-12-15 05:09:50.845749] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:30.860 [2024-12-15 05:09:50.845769] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.860 [2024-12-15 05:09:50.845791] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 
00:19:30.860 [2024-12-15 05:09:50.845837] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.860 [2024-12-15 05:09:50.845917] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:30.860 [2024-12-15 05:09:50.845942] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.860 [2024-12-15 05:09:50.845964] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.860 [2024-12-15 05:09:50.846062] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.860 [2024-12-15 05:09:50.846093] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:30.860 [2024-12-15 05:09:50.846114] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.860 [2024-12-15 05:09:50.846125] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.860 [2024-12-15 05:09:50.846161] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.860 [2024-12-15 05:09:50.846172] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:19:30.860 [2024-12-15 05:09:50.846181] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.860 [2024-12-15 05:09:50.846194] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.860 [2024-12-15 05:09:50.846240] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.860 [2024-12-15 05:09:50.846252] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:30.860 [2024-12-15 05:09:50.846261] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.860 [2024-12-15 05:09:50.846272] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.860 [2024-12-15 05:09:50.846323] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:30.860 [2024-12-15 05:09:50.846336] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:30.860 [2024-12-15 05:09:50.846344] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:30.860 [2024-12-15 05:09:50.846353] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:30.860 [2024-12-15 05:09:50.846526] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 72.623 ms, result 0 00:19:31.120 05:09:51 ftl.ftl_trim -- ftl/trim.sh@105 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=ftl0 --of=/home/vagrant/spdk_repo/spdk/test/ftl/data --count=65536 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:19:31.120 [2024-12-15 05:09:51.160861] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:19:31.120 [2024-12-15 05:09:51.161007] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid90016 ] 00:19:31.379 [2024-12-15 05:09:51.322634] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:31.379 [2024-12-15 05:09:51.350752] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:19:31.379 [2024-12-15 05:09:51.466639] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:19:31.379 [2024-12-15 05:09:51.466732] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:19:31.642 [2024-12-15 05:09:51.627461] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.642 [2024-12-15 05:09:51.627517] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:19:31.642 [2024-12-15 05:09:51.627532] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.021 ms 00:19:31.642 [2024-12-15 05:09:51.627541] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.642 [2024-12-15 05:09:51.630096] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.642 [2024-12-15 05:09:51.630291] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:31.642 [2024-12-15 05:09:51.630318] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.533 ms 00:19:31.642 [2024-12-15 05:09:51.630327] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.642 [2024-12-15 05:09:51.630788] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:19:31.642 [2024-12-15 05:09:51.631118] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:19:31.642 [2024-12-15 05:09:51.631150] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.642 [2024-12-15 05:09:51.631163] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:31.642 [2024-12-15 05:09:51.631175] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.392 ms 00:19:31.642 [2024-12-15 05:09:51.631182] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.642 [2024-12-15 05:09:51.633045] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:19:31.642 [2024-12-15 05:09:51.636906] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.642 [2024-12-15 05:09:51.637072] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:19:31.642 [2024-12-15 05:09:51.637150] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.863 ms 00:19:31.642 [2024-12-15 05:09:51.637175] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.642 [2024-12-15 05:09:51.637336] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.642 [2024-12-15 05:09:51.638080] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:19:31.642 [2024-12-15 05:09:51.638119] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.030 ms 00:19:31.642 [2024-12-15 05:09:51.638138] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.642 [2024-12-15 05:09:51.646060] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
00:19:31.642 [2024-12-15 05:09:51.646105] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:31.642 [2024-12-15 05:09:51.646116] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.835 ms 00:19:31.642 [2024-12-15 05:09:51.646124] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.642 [2024-12-15 05:09:51.646285] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.642 [2024-12-15 05:09:51.646300] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:31.642 [2024-12-15 05:09:51.646309] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.075 ms 00:19:31.642 [2024-12-15 05:09:51.646325] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.642 [2024-12-15 05:09:51.646352] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.642 [2024-12-15 05:09:51.646361] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:19:31.642 [2024-12-15 05:09:51.646370] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.008 ms 00:19:31.642 [2024-12-15 05:09:51.646378] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.643 [2024-12-15 05:09:51.646401] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on ftl_core_thread 00:19:31.643 [2024-12-15 05:09:51.648475] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.643 [2024-12-15 05:09:51.648508] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:31.643 [2024-12-15 05:09:51.648519] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.081 ms 00:19:31.643 [2024-12-15 05:09:51.648532] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.643 [2024-12-15 05:09:51.648577] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.643 [2024-12-15 05:09:51.648591] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:19:31.643 [2024-12-15 05:09:51.648599] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:19:31.643 [2024-12-15 05:09:51.648607] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.643 [2024-12-15 05:09:51.648626] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:19:31.643 [2024-12-15 05:09:51.648654] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:19:31.643 [2024-12-15 05:09:51.648692] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:19:31.643 [2024-12-15 05:09:51.648713] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:19:31.643 [2024-12-15 05:09:51.648821] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:19:31.643 [2024-12-15 05:09:51.648832] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:19:31.643 [2024-12-15 05:09:51.648843] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:19:31.643 [2024-12-15 05:09:51.648854] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:19:31.643 [2024-12-15 05:09:51.648863] ftl_layout.c: 
687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:19:31.643 [2024-12-15 05:09:51.648871] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 23592960 00:19:31.643 [2024-12-15 05:09:51.648878] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:19:31.643 [2024-12-15 05:09:51.648886] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:19:31.643 [2024-12-15 05:09:51.648896] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:19:31.643 [2024-12-15 05:09:51.648907] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.643 [2024-12-15 05:09:51.648915] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:19:31.643 [2024-12-15 05:09:51.648922] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.281 ms 00:19:31.643 [2024-12-15 05:09:51.648930] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.643 [2024-12-15 05:09:51.649018] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.643 [2024-12-15 05:09:51.649030] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:19:31.643 [2024-12-15 05:09:51.649042] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:19:31.643 [2024-12-15 05:09:51.649053] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.643 [2024-12-15 05:09:51.649155] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:19:31.643 [2024-12-15 05:09:51.649177] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:19:31.643 [2024-12-15 05:09:51.649186] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:31.643 [2024-12-15 05:09:51.649195] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649204] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:19:31.643 [2024-12-15 05:09:51.649212] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649221] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 90.00 MiB 00:19:31.643 [2024-12-15 05:09:51.649232] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:19:31.643 [2024-12-15 05:09:51.649240] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649248] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:31.643 [2024-12-15 05:09:51.649257] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:19:31.643 [2024-12-15 05:09:51.649264] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 90.62 MiB 00:19:31.643 [2024-12-15 05:09:51.649272] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:31.643 [2024-12-15 05:09:51.649279] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:19:31.643 [2024-12-15 05:09:51.649288] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.88 MiB 00:19:31.643 [2024-12-15 05:09:51.649297] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649304] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:19:31.643 [2024-12-15 05:09:51.649313] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 124.00 MiB 00:19:31.643 [2024-12-15 05:09:51.649321] ftl_layout.c: 
133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649328] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:19:31.643 [2024-12-15 05:09:51.649336] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 91.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649345] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:31.643 [2024-12-15 05:09:51.649352] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:19:31.643 [2024-12-15 05:09:51.649365] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 99.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649373] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:31.643 [2024-12-15 05:09:51.649380] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:19:31.643 [2024-12-15 05:09:51.649388] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 107.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649398] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:31.643 [2024-12-15 05:09:51.649407] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:19:31.643 [2024-12-15 05:09:51.649414] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 115.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649422] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:31.643 [2024-12-15 05:09:51.649430] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:19:31.643 [2024-12-15 05:09:51.649670] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649690] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:31.643 [2024-12-15 05:09:51.649709] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:19:31.643 [2024-12-15 05:09:51.649727] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.38 MiB 00:19:31.643 [2024-12-15 05:09:51.649746] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:31.643 [2024-12-15 05:09:51.649764] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:19:31.643 [2024-12-15 05:09:51.649782] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.62 MiB 00:19:31.643 [2024-12-15 05:09:51.649805] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649823] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:19:31.643 [2024-12-15 05:09:51.649841] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 123.75 MiB 00:19:31.643 [2024-12-15 05:09:51.649859] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:31.643 [2024-12-15 05:09:51.649937] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:19:31.643 [2024-12-15 05:09:51.649962] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:19:31.643 [2024-12-15 05:09:51.649982] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:31.643 [2024-12-15 05:09:51.650002] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:31.643 [2024-12-15 05:09:51.650022] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:19:31.643 [2024-12-15 05:09:51.650040] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:19:31.643 [2024-12-15 05:09:51.650058] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:19:31.643 
[2024-12-15 05:09:51.650076] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:19:31.643 [2024-12-15 05:09:51.650094] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:19:31.643 [2024-12-15 05:09:51.650113] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:19:31.643 [2024-12-15 05:09:51.650133] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:19:31.643 [2024-12-15 05:09:51.650174] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:31.643 [2024-12-15 05:09:51.650248] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5a00 00:19:31.643 [2024-12-15 05:09:51.650303] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5a20 blk_sz:0x80 00:19:31.643 [2024-12-15 05:09:51.650719] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x5aa0 blk_sz:0x80 00:19:31.643 [2024-12-15 05:09:51.650763] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5b20 blk_sz:0x800 00:19:31.643 [2024-12-15 05:09:51.650793] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x6320 blk_sz:0x800 00:19:31.643 [2024-12-15 05:09:51.650823] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6b20 blk_sz:0x800 00:19:31.643 [2024-12-15 05:09:51.650844] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x7320 blk_sz:0x800 00:19:31.643 [2024-12-15 05:09:51.650853] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7b20 blk_sz:0x40 00:19:31.643 [2024-12-15 05:09:51.650861] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7b60 blk_sz:0x40 00:19:31.643 [2024-12-15 05:09:51.650868] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x7ba0 blk_sz:0x20 00:19:31.643 [2024-12-15 05:09:51.650876] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x7bc0 blk_sz:0x20 00:19:31.643 [2024-12-15 05:09:51.650884] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x7be0 blk_sz:0x20 00:19:31.643 [2024-12-15 05:09:51.650890] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7c00 blk_sz:0x20 00:19:31.643 [2024-12-15 05:09:51.650898] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7c20 blk_sz:0x13b6e0 00:19:31.643 [2024-12-15 05:09:51.650906] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:19:31.643 [2024-12-15 05:09:51.650923] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:31.643 [2024-12-15 05:09:51.650934] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe 
ver:0 blk_offs:0x20 blk_sz:0x20 00:19:31.644 [2024-12-15 05:09:51.650943] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:19:31.644 [2024-12-15 05:09:51.650950] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:19:31.644 [2024-12-15 05:09:51.650958] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:19:31.644 [2024-12-15 05:09:51.650969] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.650979] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:19:31.644 [2024-12-15 05:09:51.650989] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.883 ms 00:19:31.644 [2024-12-15 05:09:51.650997] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.664708] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.664756] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:31.644 [2024-12-15 05:09:51.664777] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 13.611 ms 00:19:31.644 [2024-12-15 05:09:51.664785] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.664922] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.664941] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:19:31.644 [2024-12-15 05:09:51.664950] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.066 ms 00:19:31.644 [2024-12-15 05:09:51.664957] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.686406] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.686484] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:31.644 [2024-12-15 05:09:51.686499] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 21.426 ms 00:19:31.644 [2024-12-15 05:09:51.686508] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.686636] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.686656] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:31.644 [2024-12-15 05:09:51.686670] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:31.644 [2024-12-15 05:09:51.686679] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.687211] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.687255] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:31.644 [2024-12-15 05:09:51.687276] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.507 ms 00:19:31.644 [2024-12-15 05:09:51.687286] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.687470] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.687486] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:31.644 [2024-12-15 05:09:51.687495] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.152 ms 00:19:31.644 [2024-12-15 05:09:51.687504] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.695658] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.695702] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:31.644 [2024-12-15 05:09:51.695719] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.130 ms 00:19:31.644 [2024-12-15 05:09:51.695729] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.699476] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 2, empty chunks = 2 00:19:31.644 [2024-12-15 05:09:51.699518] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:19:31.644 [2024-12-15 05:09:51.699530] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.699538] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:19:31.644 [2024-12-15 05:09:51.699547] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.701 ms 00:19:31.644 [2024-12-15 05:09:51.699555] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.715656] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.715703] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:19:31.644 [2024-12-15 05:09:51.715716] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 16.040 ms 00:19:31.644 [2024-12-15 05:09:51.715723] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.718602] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.718644] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:19:31.644 [2024-12-15 05:09:51.718654] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.790 ms 00:19:31.644 [2024-12-15 05:09:51.718661] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.721147] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.721199] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:19:31.644 [2024-12-15 05:09:51.721209] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.435 ms 00:19:31.644 [2024-12-15 05:09:51.721216] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.721625] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.721639] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:19:31.644 [2024-12-15 05:09:51.721649] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.331 ms 00:19:31.644 [2024-12-15 05:09:51.721657] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.746134] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.746200] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:19:31.644 [2024-12-15 05:09:51.746223] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 
24.450 ms 00:19:31.644 [2024-12-15 05:09:51.746233] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.754516] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 59 (of 60) MiB 00:19:31.644 [2024-12-15 05:09:51.773314] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.773557] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:19:31.644 [2024-12-15 05:09:51.773579] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 26.979 ms 00:19:31.644 [2024-12-15 05:09:51.773589] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.773684] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.773701] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:19:31.644 [2024-12-15 05:09:51.773715] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.018 ms 00:19:31.644 [2024-12-15 05:09:51.773723] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.773786] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.773796] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:19:31.644 [2024-12-15 05:09:51.773810] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.040 ms 00:19:31.644 [2024-12-15 05:09:51.773818] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.773845] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.773854] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:19:31.644 [2024-12-15 05:09:51.773863] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.010 ms 00:19:31.644 [2024-12-15 05:09:51.773874] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.644 [2024-12-15 05:09:51.773913] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:19:31.644 [2024-12-15 05:09:51.773925] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.644 [2024-12-15 05:09:51.773933] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:19:31.644 [2024-12-15 05:09:51.773942] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:19:31.644 [2024-12-15 05:09:51.773950] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.905 [2024-12-15 05:09:51.779976] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.905 [2024-12-15 05:09:51.780026] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:19:31.905 [2024-12-15 05:09:51.780038] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.005 ms 00:19:31.905 [2024-12-15 05:09:51.780058] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.905 [2024-12-15 05:09:51.780159] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:31.905 [2024-12-15 05:09:51.780171] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:19:31.905 [2024-12-15 05:09:51.780181] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.038 ms 00:19:31.905 [2024-12-15 05:09:51.780189] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:31.905 
[2024-12-15 05:09:51.781221] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:19:31.905 [2024-12-15 05:09:51.782568] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 153.472 ms, result 0 00:19:31.905 [2024-12-15 05:09:51.783994] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:19:31.905 [2024-12-15 05:09:51.791189] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:19:32.915  [2024-12-15T05:09:53.999Z] Copying: 14/256 [MB] (14 MBps) [2024-12-15T05:09:54.942Z] Copying: 29/256 [MB] (15 MBps) [2024-12-15T05:09:55.884Z] Copying: 42/256 [MB] (13 MBps) [2024-12-15T05:09:57.270Z] Copying: 58/256 [MB] (15 MBps) [2024-12-15T05:09:58.214Z] Copying: 71/256 [MB] (13 MBps) [2024-12-15T05:09:59.157Z] Copying: 82/256 [MB] (10 MBps) [2024-12-15T05:10:00.100Z] Copying: 98/256 [MB] (16 MBps) [2024-12-15T05:10:01.044Z] Copying: 114/256 [MB] (15 MBps) [2024-12-15T05:10:01.986Z] Copying: 132/256 [MB] (18 MBps) [2024-12-15T05:10:02.930Z] Copying: 154/256 [MB] (21 MBps) [2024-12-15T05:10:03.873Z] Copying: 165/256 [MB] (11 MBps) [2024-12-15T05:10:05.259Z] Copying: 175/256 [MB] (10 MBps) [2024-12-15T05:10:06.202Z] Copying: 199/256 [MB] (23 MBps) [2024-12-15T05:10:07.146Z] Copying: 218/256 [MB] (19 MBps) [2024-12-15T05:10:08.090Z] Copying: 233/256 [MB] (14 MBps) [2024-12-15T05:10:08.351Z] Copying: 248/256 [MB] (15 MBps) [2024-12-15T05:10:08.925Z] Copying: 256/256 [MB] (average 15 MBps)[2024-12-15 05:10:08.736259] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:19:48.785 [2024-12-15 05:10:08.739261] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.739558] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:19:48.785 [2024-12-15 05:10:08.739851] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.007 ms 00:19:48.785 [2024-12-15 05:10:08.739886] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.739954] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on ftl_core_thread 00:19:48.785 [2024-12-15 05:10:08.740882] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.740928] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:19:48.785 [2024-12-15 05:10:08.740941] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.897 ms 00:19:48.785 [2024-12-15 05:10:08.740950] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.741282] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.741302] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:19:48.785 [2024-12-15 05:10:08.741317] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.301 ms 00:19:48.785 [2024-12-15 05:10:08.741327] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.745733] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.745762] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:19:48.785 [2024-12-15 05:10:08.745774] mngt/ftl_mngt.c: 
430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.387 ms 00:19:48.785 [2024-12-15 05:10:08.745784] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.753786] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.753845] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:19:48.785 [2024-12-15 05:10:08.753857] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.955 ms 00:19:48.785 [2024-12-15 05:10:08.753869] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.757209] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.757269] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:19:48.785 [2024-12-15 05:10:08.757280] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.285 ms 00:19:48.785 [2024-12-15 05:10:08.757289] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.762814] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.763024] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:19:48.785 [2024-12-15 05:10:08.763045] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.470 ms 00:19:48.785 [2024-12-15 05:10:08.763054] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.763299] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.763331] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:19:48.785 [2024-12-15 05:10:08.763346] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.105 ms 00:19:48.785 [2024-12-15 05:10:08.763354] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.767126] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.767180] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:19:48.785 [2024-12-15 05:10:08.767191] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.751 ms 00:19:48.785 [2024-12-15 05:10:08.767199] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.770355] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.770562] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:19:48.785 [2024-12-15 05:10:08.770582] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.106 ms 00:19:48.785 [2024-12-15 05:10:08.770591] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.773387] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.773472] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:19:48.785 [2024-12-15 05:10:08.773487] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.461 ms 00:19:48.785 [2024-12-15 05:10:08.773495] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.776203] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.785 [2024-12-15 05:10:08.776257] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean 
state 00:19:48.785 [2024-12-15 05:10:08.776268] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.616 ms 00:19:48.785 [2024-12-15 05:10:08.776276] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.785 [2024-12-15 05:10:08.776342] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:19:48.785 [2024-12-15 05:10:08.776361] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776383] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776392] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776401] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776409] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776418] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776426] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776466] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776475] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776483] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776491] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776500] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776508] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776516] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776525] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776533] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776541] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776549] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776557] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776567] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776577] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776585] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776593] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776600] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776609] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776618] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776627] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776634] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776642] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:19:48.785 [2024-12-15 05:10:08.776651] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776660] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776668] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776676] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776684] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776691] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776699] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776708] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776715] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776723] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776730] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776741] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776750] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776758] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776765] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776773] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776787] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776796] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 
05:10:08.776803] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776811] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776818] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776826] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776834] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776842] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776851] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776859] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776866] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776873] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776880] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776888] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776897] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776904] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776911] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776918] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776927] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776934] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776942] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776951] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776959] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776967] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776975] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776983] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.776991] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 
00:19:48.786 [2024-12-15 05:10:08.777001] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777009] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777017] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777024] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777031] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777039] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777049] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777056] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777064] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777071] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777079] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777086] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777095] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777104] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777112] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777120] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777127] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777134] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777142] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777149] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777158] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777166] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777174] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777182] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777199] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 
wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777209] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777218] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777226] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:19:48.786 [2024-12-15 05:10:08.777242] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:19:48.786 [2024-12-15 05:10:08.777252] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 3af9b670-83ba-4af2-8311-25d469da50d8 00:19:48.786 [2024-12-15 05:10:08.777262] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:19:48.786 [2024-12-15 05:10:08.777270] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:19:48.786 [2024-12-15 05:10:08.777278] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:19:48.786 [2024-12-15 05:10:08.777287] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:19:48.786 [2024-12-15 05:10:08.777295] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:19:48.786 [2024-12-15 05:10:08.777308] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:19:48.786 [2024-12-15 05:10:08.777316] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:19:48.786 [2024-12-15 05:10:08.777323] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:19:48.786 [2024-12-15 05:10:08.777330] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:19:48.786 [2024-12-15 05:10:08.777337] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.786 [2024-12-15 05:10:08.777345] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:19:48.786 [2024-12-15 05:10:08.777360] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.996 ms 00:19:48.786 [2024-12-15 05:10:08.777369] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.786 [2024-12-15 05:10:08.779765] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.786 [2024-12-15 05:10:08.779803] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:19:48.786 [2024-12-15 05:10:08.779817] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.375 ms 00:19:48.786 [2024-12-15 05:10:08.779834] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.786 [2024-12-15 05:10:08.779957] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:48.786 [2024-12-15 05:10:08.779968] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:19:48.786 [2024-12-15 05:10:08.779977] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.097 ms 00:19:48.786 [2024-12-15 05:10:08.779986] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.788256] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.788418] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:48.787 [2024-12-15 05:10:08.788488] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.788522] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 
05:10:08.788623] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.788651] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:48.787 [2024-12-15 05:10:08.788677] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.788696] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.788758] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.788866] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:48.787 [2024-12-15 05:10:08.788888] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.788911] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.788945] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.788966] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:48.787 [2024-12-15 05:10:08.788985] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.789054] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.802454] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.802632] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:48.787 [2024-12-15 05:10:08.802689] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.802723] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.813329] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.813509] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:48.787 [2024-12-15 05:10:08.813564] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.813589] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.813654] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.813677] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:48.787 [2024-12-15 05:10:08.813699] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.813719] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.813765] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.813787] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:48.787 [2024-12-15 05:10:08.813806] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.813876] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.813975] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.814001] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:48.787 [2024-12-15 05:10:08.814022] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.814123] 
mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.814170] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.814199] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:19:48.787 [2024-12-15 05:10:08.814262] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.814285] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.814369] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.814396] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:48.787 [2024-12-15 05:10:08.814480] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.814504] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.814574] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:48.787 [2024-12-15 05:10:08.814640] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:48.787 [2024-12-15 05:10:08.814666] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:48.787 [2024-12-15 05:10:08.814764] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:48.787 [2024-12-15 05:10:08.814934] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 75.671 ms, result 0 00:19:49.047 00:19:49.047 00:19:49.047 05:10:08 ftl.ftl_trim -- ftl/trim.sh@106 -- # md5sum -c /home/vagrant/spdk_repo/spdk/test/ftl/testfile.md5 00:19:49.618 /home/vagrant/spdk_repo/spdk/test/ftl/data: OK 00:19:49.618 05:10:09 ftl.ftl_trim -- ftl/trim.sh@108 -- # trap - SIGINT SIGTERM EXIT 00:19:49.618 05:10:09 ftl.ftl_trim -- ftl/trim.sh@109 -- # fio_kill 00:19:49.618 05:10:09 ftl.ftl_trim -- ftl/trim.sh@15 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/testfile.md5 00:19:49.618 05:10:09 ftl.ftl_trim -- ftl/trim.sh@16 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:19:49.618 05:10:09 ftl.ftl_trim -- ftl/trim.sh@17 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/random_pattern 00:19:49.618 05:10:09 ftl.ftl_trim -- ftl/trim.sh@18 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/data 00:19:49.618 05:10:09 ftl.ftl_trim -- ftl/trim.sh@20 -- # killprocess 89980 00:19:49.618 05:10:09 ftl.ftl_trim -- common/autotest_common.sh@954 -- # '[' -z 89980 ']' 00:19:49.618 05:10:09 ftl.ftl_trim -- common/autotest_common.sh@958 -- # kill -0 89980 00:19:49.618 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 958: kill: (89980) - No such process 00:19:49.618 Process with pid 89980 is not found 00:19:49.618 05:10:09 ftl.ftl_trim -- common/autotest_common.sh@981 -- # echo 'Process with pid 89980 is not found' 00:19:49.618 ************************************ 00:19:49.618 END TEST ftl_trim 00:19:49.618 ************************************ 00:19:49.618 00:19:49.618 real 1m15.124s 00:19:49.618 user 1m38.401s 00:19:49.618 sys 0m5.634s 00:19:49.618 05:10:09 ftl.ftl_trim -- common/autotest_common.sh@1130 -- # xtrace_disable 00:19:49.618 05:10:09 ftl.ftl_trim -- common/autotest_common.sh@10 -- # set +x 00:19:49.618 05:10:09 ftl -- ftl/ftl.sh@76 -- # run_test ftl_restore /home/vagrant/spdk_repo/spdk/test/ftl/restore.sh -c 0000:00:10.0 0000:00:11.0 00:19:49.618 05:10:09 ftl -- common/autotest_common.sh@1105 -- # '[' 5 
-le 1 ']' 00:19:49.618 05:10:09 ftl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:19:49.619 05:10:09 ftl -- common/autotest_common.sh@10 -- # set +x 00:19:49.619 ************************************ 00:19:49.619 START TEST ftl_restore 00:19:49.619 ************************************ 00:19:49.619 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/ftl/restore.sh -c 0000:00:10.0 0000:00:11.0 00:19:49.879 * Looking for test storage... 00:19:49.879 * Found test storage at /home/vagrant/spdk_repo/spdk/test/ftl 00:19:49.879 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:19:49.879 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@1711 -- # lcov --version 00:19:49.879 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:19:49.879 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:19:49.879 05:10:09 ftl.ftl_restore -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:19:49.879 05:10:09 ftl.ftl_restore -- scripts/common.sh@333 -- # local ver1 ver1_l 00:19:49.879 05:10:09 ftl.ftl_restore -- scripts/common.sh@334 -- # local ver2 ver2_l 00:19:49.879 05:10:09 ftl.ftl_restore -- scripts/common.sh@336 -- # IFS=.-: 00:19:49.879 05:10:09 ftl.ftl_restore -- scripts/common.sh@336 -- # read -ra ver1 00:19:49.879 05:10:09 ftl.ftl_restore -- scripts/common.sh@337 -- # IFS=.-: 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@337 -- # read -ra ver2 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@338 -- # local 'op=<' 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@340 -- # ver1_l=2 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@341 -- # ver2_l=1 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@344 -- # case "$op" in 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@345 -- # : 1 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@364 -- # (( v = 0 )) 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@365 -- # decimal 1 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@353 -- # local d=1 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@355 -- # echo 1 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@365 -- # ver1[v]=1 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@366 -- # decimal 2 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@353 -- # local d=2 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@355 -- # echo 2 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@366 -- # ver2[v]=2 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:19:49.880 05:10:09 ftl.ftl_restore -- scripts/common.sh@368 -- # return 0 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:19:49.880 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:49.880 --rc genhtml_branch_coverage=1 00:19:49.880 --rc genhtml_function_coverage=1 00:19:49.880 --rc genhtml_legend=1 00:19:49.880 --rc geninfo_all_blocks=1 00:19:49.880 --rc geninfo_unexecuted_blocks=1 00:19:49.880 00:19:49.880 ' 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:19:49.880 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:49.880 --rc genhtml_branch_coverage=1 00:19:49.880 --rc genhtml_function_coverage=1 00:19:49.880 --rc genhtml_legend=1 00:19:49.880 --rc geninfo_all_blocks=1 00:19:49.880 --rc geninfo_unexecuted_blocks=1 00:19:49.880 00:19:49.880 ' 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:19:49.880 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:49.880 --rc genhtml_branch_coverage=1 00:19:49.880 --rc genhtml_function_coverage=1 00:19:49.880 --rc genhtml_legend=1 00:19:49.880 --rc geninfo_all_blocks=1 00:19:49.880 --rc geninfo_unexecuted_blocks=1 00:19:49.880 00:19:49.880 ' 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:19:49.880 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:19:49.880 --rc genhtml_branch_coverage=1 00:19:49.880 --rc genhtml_function_coverage=1 00:19:49.880 --rc genhtml_legend=1 00:19:49.880 --rc geninfo_all_blocks=1 00:19:49.880 --rc geninfo_unexecuted_blocks=1 00:19:49.880 00:19:49.880 ' 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/ftl/common.sh 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/ftl/restore.sh 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@8 -- # testdir=/home/vagrant/spdk_repo/spdk/test/ftl 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@9 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl/../.. 
00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@9 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@10 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@12 -- # export 'ftl_tgt_core_mask=[0]' 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@12 -- # ftl_tgt_core_mask='[0]' 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@14 -- # export spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@14 -- # spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@15 -- # export 'spdk_tgt_cpumask=[0]' 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@15 -- # spdk_tgt_cpumask='[0]' 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@16 -- # export spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@16 -- # spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@17 -- # export spdk_tgt_pid= 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@17 -- # spdk_tgt_pid= 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@19 -- # export spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@19 -- # spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@20 -- # export 'spdk_ini_cpumask=[1]' 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@20 -- # spdk_ini_cpumask='[1]' 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@21 -- # export spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@21 -- # spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@22 -- # export spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@22 -- # spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@23 -- # export spdk_ini_pid= 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@23 -- # spdk_ini_pid= 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@25 -- # export spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/common.sh@25 -- # spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@11 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@13 -- # mktemp -d 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@13 -- # mount_dir=/tmp/tmp.7kP1mMrpr5 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@15 -- # getopts :u:c:f opt 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@16 -- # case $opt in 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@18 -- # nv_cache=0000:00:10.0 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@15 -- # getopts :u:c:f opt 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@23 -- # shift 2 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@24 -- # device=0000:00:11.0 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@25 -- # timeout=240 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@36 -- # trap 'restore_kill; exit 1' SIGINT SIGTERM EXIT 00:19:49.880 
05:10:09 ftl.ftl_restore -- ftl/restore.sh@39 -- # svcpid=90280 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@41 -- # waitforlisten 90280 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@835 -- # '[' -z 90280 ']' 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@840 -- # local max_retries=100 00:19:49.880 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:19:49.880 05:10:09 ftl.ftl_restore -- ftl/restore.sh@38 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@844 -- # xtrace_disable 00:19:49.880 05:10:09 ftl.ftl_restore -- common/autotest_common.sh@10 -- # set +x 00:19:49.880 [2024-12-15 05:10:09.967919] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:19:49.880 [2024-12-15 05:10:09.968240] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid90280 ] 00:19:50.141 [2024-12-15 05:10:10.129996] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:50.141 [2024-12-15 05:10:10.159315] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:19:50.713 05:10:10 ftl.ftl_restore -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:19:50.713 05:10:10 ftl.ftl_restore -- common/autotest_common.sh@868 -- # return 0 00:19:50.713 05:10:10 ftl.ftl_restore -- ftl/restore.sh@43 -- # create_base_bdev nvme0 0000:00:11.0 103424 00:19:50.713 05:10:10 ftl.ftl_restore -- ftl/common.sh@54 -- # local name=nvme0 00:19:50.713 05:10:10 ftl.ftl_restore -- ftl/common.sh@55 -- # local base_bdf=0000:00:11.0 00:19:50.713 05:10:10 ftl.ftl_restore -- ftl/common.sh@56 -- # local size=103424 00:19:50.713 05:10:10 ftl.ftl_restore -- ftl/common.sh@59 -- # local base_bdev 00:19:50.713 05:10:10 ftl.ftl_restore -- ftl/common.sh@60 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t PCIe -a 0000:00:11.0 00:19:50.973 05:10:11 ftl.ftl_restore -- ftl/common.sh@60 -- # base_bdev=nvme0n1 00:19:50.973 05:10:11 ftl.ftl_restore -- ftl/common.sh@62 -- # local base_size 00:19:50.973 05:10:11 ftl.ftl_restore -- ftl/common.sh@63 -- # get_bdev_size nvme0n1 00:19:50.973 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1382 -- # local bdev_name=nvme0n1 00:19:50.973 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1383 -- # local bdev_info 00:19:50.973 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1384 -- # local bs 00:19:50.973 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1385 -- # local nb 00:19:50.973 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b nvme0n1 00:19:51.234 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:19:51.234 { 00:19:51.234 "name": "nvme0n1", 00:19:51.234 "aliases": [ 00:19:51.234 "9f9d57c7-b47a-4f64-8bc6-8c9ec586e5da" 00:19:51.234 ], 00:19:51.234 "product_name": "NVMe disk", 00:19:51.234 "block_size": 4096, 00:19:51.234 "num_blocks": 1310720, 00:19:51.234 "uuid": 
"9f9d57c7-b47a-4f64-8bc6-8c9ec586e5da", 00:19:51.234 "numa_id": -1, 00:19:51.234 "assigned_rate_limits": { 00:19:51.234 "rw_ios_per_sec": 0, 00:19:51.234 "rw_mbytes_per_sec": 0, 00:19:51.234 "r_mbytes_per_sec": 0, 00:19:51.234 "w_mbytes_per_sec": 0 00:19:51.234 }, 00:19:51.234 "claimed": true, 00:19:51.234 "claim_type": "read_many_write_one", 00:19:51.234 "zoned": false, 00:19:51.234 "supported_io_types": { 00:19:51.234 "read": true, 00:19:51.234 "write": true, 00:19:51.234 "unmap": true, 00:19:51.234 "flush": true, 00:19:51.234 "reset": true, 00:19:51.234 "nvme_admin": true, 00:19:51.234 "nvme_io": true, 00:19:51.234 "nvme_io_md": false, 00:19:51.234 "write_zeroes": true, 00:19:51.234 "zcopy": false, 00:19:51.234 "get_zone_info": false, 00:19:51.234 "zone_management": false, 00:19:51.234 "zone_append": false, 00:19:51.234 "compare": true, 00:19:51.234 "compare_and_write": false, 00:19:51.234 "abort": true, 00:19:51.234 "seek_hole": false, 00:19:51.234 "seek_data": false, 00:19:51.234 "copy": true, 00:19:51.234 "nvme_iov_md": false 00:19:51.234 }, 00:19:51.234 "driver_specific": { 00:19:51.234 "nvme": [ 00:19:51.234 { 00:19:51.234 "pci_address": "0000:00:11.0", 00:19:51.234 "trid": { 00:19:51.234 "trtype": "PCIe", 00:19:51.234 "traddr": "0000:00:11.0" 00:19:51.234 }, 00:19:51.234 "ctrlr_data": { 00:19:51.234 "cntlid": 0, 00:19:51.234 "vendor_id": "0x1b36", 00:19:51.234 "model_number": "QEMU NVMe Ctrl", 00:19:51.234 "serial_number": "12341", 00:19:51.234 "firmware_revision": "8.0.0", 00:19:51.234 "subnqn": "nqn.2019-08.org.qemu:12341", 00:19:51.234 "oacs": { 00:19:51.234 "security": 0, 00:19:51.234 "format": 1, 00:19:51.234 "firmware": 0, 00:19:51.234 "ns_manage": 1 00:19:51.234 }, 00:19:51.234 "multi_ctrlr": false, 00:19:51.234 "ana_reporting": false 00:19:51.234 }, 00:19:51.234 "vs": { 00:19:51.234 "nvme_version": "1.4" 00:19:51.234 }, 00:19:51.234 "ns_data": { 00:19:51.234 "id": 1, 00:19:51.234 "can_share": false 00:19:51.234 } 00:19:51.234 } 00:19:51.234 ], 00:19:51.234 "mp_policy": "active_passive" 00:19:51.234 } 00:19:51.234 } 00:19:51.234 ]' 00:19:51.234 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:19:51.234 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1387 -- # bs=4096 00:19:51.234 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:19:51.234 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1388 -- # nb=1310720 00:19:51.234 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1391 -- # bdev_size=5120 00:19:51.234 05:10:11 ftl.ftl_restore -- common/autotest_common.sh@1392 -- # echo 5120 00:19:51.234 05:10:11 ftl.ftl_restore -- ftl/common.sh@63 -- # base_size=5120 00:19:51.234 05:10:11 ftl.ftl_restore -- ftl/common.sh@64 -- # [[ 103424 -le 5120 ]] 00:19:51.234 05:10:11 ftl.ftl_restore -- ftl/common.sh@67 -- # clear_lvols 00:19:51.234 05:10:11 ftl.ftl_restore -- ftl/common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_get_lvstores 00:19:51.234 05:10:11 ftl.ftl_restore -- ftl/common.sh@28 -- # jq -r '.[] | .uuid' 00:19:51.495 05:10:11 ftl.ftl_restore -- ftl/common.sh@28 -- # stores=43f5c3f1-af03-43d0-b7e9-91da31196343 00:19:51.495 05:10:11 ftl.ftl_restore -- ftl/common.sh@29 -- # for lvs in $stores 00:19:51.495 05:10:11 ftl.ftl_restore -- ftl/common.sh@30 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -u 43f5c3f1-af03-43d0-b7e9-91da31196343 00:19:51.756 05:10:11 ftl.ftl_restore -- ftl/common.sh@68 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py 
bdev_lvol_create_lvstore nvme0n1 lvs 00:19:52.017 05:10:12 ftl.ftl_restore -- ftl/common.sh@68 -- # lvs=e7561374-28d2-4ab0-b451-50ad9d9ca932 00:19:52.017 05:10:12 ftl.ftl_restore -- ftl/common.sh@69 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create nvme0n1p0 103424 -t -u e7561374-28d2-4ab0-b451-50ad9d9ca932 00:19:52.277 05:10:12 ftl.ftl_restore -- ftl/restore.sh@43 -- # split_bdev=f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:52.277 05:10:12 ftl.ftl_restore -- ftl/restore.sh@44 -- # '[' -n 0000:00:10.0 ']' 00:19:52.277 05:10:12 ftl.ftl_restore -- ftl/restore.sh@45 -- # create_nv_cache_bdev nvc0 0000:00:10.0 f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:52.277 05:10:12 ftl.ftl_restore -- ftl/common.sh@35 -- # local name=nvc0 00:19:52.277 05:10:12 ftl.ftl_restore -- ftl/common.sh@36 -- # local cache_bdf=0000:00:10.0 00:19:52.277 05:10:12 ftl.ftl_restore -- ftl/common.sh@37 -- # local base_bdev=f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:52.277 05:10:12 ftl.ftl_restore -- ftl/common.sh@38 -- # local cache_size= 00:19:52.277 05:10:12 ftl.ftl_restore -- ftl/common.sh@41 -- # get_bdev_size f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:52.277 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1382 -- # local bdev_name=f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:52.277 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1383 -- # local bdev_info 00:19:52.277 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1384 -- # local bs 00:19:52.277 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1385 -- # local nb 00:19:52.277 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:52.538 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:19:52.538 { 00:19:52.538 "name": "f7c57030-e14e-43e0-b88b-217dbd3ad4a3", 00:19:52.538 "aliases": [ 00:19:52.538 "lvs/nvme0n1p0" 00:19:52.538 ], 00:19:52.538 "product_name": "Logical Volume", 00:19:52.538 "block_size": 4096, 00:19:52.538 "num_blocks": 26476544, 00:19:52.538 "uuid": "f7c57030-e14e-43e0-b88b-217dbd3ad4a3", 00:19:52.538 "assigned_rate_limits": { 00:19:52.538 "rw_ios_per_sec": 0, 00:19:52.538 "rw_mbytes_per_sec": 0, 00:19:52.538 "r_mbytes_per_sec": 0, 00:19:52.538 "w_mbytes_per_sec": 0 00:19:52.538 }, 00:19:52.538 "claimed": false, 00:19:52.538 "zoned": false, 00:19:52.538 "supported_io_types": { 00:19:52.538 "read": true, 00:19:52.538 "write": true, 00:19:52.538 "unmap": true, 00:19:52.538 "flush": false, 00:19:52.538 "reset": true, 00:19:52.538 "nvme_admin": false, 00:19:52.538 "nvme_io": false, 00:19:52.538 "nvme_io_md": false, 00:19:52.538 "write_zeroes": true, 00:19:52.538 "zcopy": false, 00:19:52.538 "get_zone_info": false, 00:19:52.538 "zone_management": false, 00:19:52.538 "zone_append": false, 00:19:52.538 "compare": false, 00:19:52.538 "compare_and_write": false, 00:19:52.538 "abort": false, 00:19:52.538 "seek_hole": true, 00:19:52.538 "seek_data": true, 00:19:52.538 "copy": false, 00:19:52.538 "nvme_iov_md": false 00:19:52.538 }, 00:19:52.538 "driver_specific": { 00:19:52.538 "lvol": { 00:19:52.539 "lvol_store_uuid": "e7561374-28d2-4ab0-b451-50ad9d9ca932", 00:19:52.539 "base_bdev": "nvme0n1", 00:19:52.539 "thin_provision": true, 00:19:52.539 "num_allocated_clusters": 0, 00:19:52.539 "snapshot": false, 00:19:52.539 "clone": false, 00:19:52.539 "esnap_clone": false 00:19:52.539 } 00:19:52.539 } 00:19:52.539 } 00:19:52.539 ]' 00:19:52.539 05:10:12 ftl.ftl_restore -- 
common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:19:52.539 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1387 -- # bs=4096 00:19:52.539 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:19:52.539 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1388 -- # nb=26476544 00:19:52.539 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:19:52.539 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1392 -- # echo 103424 00:19:52.539 05:10:12 ftl.ftl_restore -- ftl/common.sh@41 -- # local base_size=5171 00:19:52.539 05:10:12 ftl.ftl_restore -- ftl/common.sh@44 -- # local nvc_bdev 00:19:52.539 05:10:12 ftl.ftl_restore -- ftl/common.sh@45 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvc0 -t PCIe -a 0000:00:10.0 00:19:52.800 05:10:12 ftl.ftl_restore -- ftl/common.sh@45 -- # nvc_bdev=nvc0n1 00:19:52.800 05:10:12 ftl.ftl_restore -- ftl/common.sh@47 -- # [[ -z '' ]] 00:19:52.800 05:10:12 ftl.ftl_restore -- ftl/common.sh@48 -- # get_bdev_size f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:52.800 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1382 -- # local bdev_name=f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:52.800 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1383 -- # local bdev_info 00:19:52.800 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1384 -- # local bs 00:19:52.800 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1385 -- # local nb 00:19:52.800 05:10:12 ftl.ftl_restore -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:53.061 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:19:53.061 { 00:19:53.061 "name": "f7c57030-e14e-43e0-b88b-217dbd3ad4a3", 00:19:53.061 "aliases": [ 00:19:53.061 "lvs/nvme0n1p0" 00:19:53.061 ], 00:19:53.061 "product_name": "Logical Volume", 00:19:53.061 "block_size": 4096, 00:19:53.061 "num_blocks": 26476544, 00:19:53.061 "uuid": "f7c57030-e14e-43e0-b88b-217dbd3ad4a3", 00:19:53.061 "assigned_rate_limits": { 00:19:53.061 "rw_ios_per_sec": 0, 00:19:53.061 "rw_mbytes_per_sec": 0, 00:19:53.061 "r_mbytes_per_sec": 0, 00:19:53.061 "w_mbytes_per_sec": 0 00:19:53.061 }, 00:19:53.061 "claimed": false, 00:19:53.061 "zoned": false, 00:19:53.061 "supported_io_types": { 00:19:53.061 "read": true, 00:19:53.061 "write": true, 00:19:53.061 "unmap": true, 00:19:53.061 "flush": false, 00:19:53.061 "reset": true, 00:19:53.061 "nvme_admin": false, 00:19:53.061 "nvme_io": false, 00:19:53.061 "nvme_io_md": false, 00:19:53.061 "write_zeroes": true, 00:19:53.061 "zcopy": false, 00:19:53.061 "get_zone_info": false, 00:19:53.061 "zone_management": false, 00:19:53.061 "zone_append": false, 00:19:53.061 "compare": false, 00:19:53.061 "compare_and_write": false, 00:19:53.061 "abort": false, 00:19:53.061 "seek_hole": true, 00:19:53.061 "seek_data": true, 00:19:53.061 "copy": false, 00:19:53.061 "nvme_iov_md": false 00:19:53.061 }, 00:19:53.061 "driver_specific": { 00:19:53.061 "lvol": { 00:19:53.061 "lvol_store_uuid": "e7561374-28d2-4ab0-b451-50ad9d9ca932", 00:19:53.061 "base_bdev": "nvme0n1", 00:19:53.061 "thin_provision": true, 00:19:53.061 "num_allocated_clusters": 0, 00:19:53.061 "snapshot": false, 00:19:53.061 "clone": false, 00:19:53.061 "esnap_clone": false 00:19:53.061 } 00:19:53.061 } 00:19:53.061 } 00:19:53.061 ]' 00:19:53.061 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 
00:19:53.061 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1387 -- # bs=4096 00:19:53.061 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:19:53.061 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1388 -- # nb=26476544 00:19:53.061 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:19:53.061 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1392 -- # echo 103424 00:19:53.061 05:10:13 ftl.ftl_restore -- ftl/common.sh@48 -- # cache_size=5171 00:19:53.061 05:10:13 ftl.ftl_restore -- ftl/common.sh@50 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_split_create nvc0n1 -s 5171 1 00:19:53.322 05:10:13 ftl.ftl_restore -- ftl/restore.sh@45 -- # nvc_bdev=nvc0n1p0 00:19:53.322 05:10:13 ftl.ftl_restore -- ftl/restore.sh@48 -- # get_bdev_size f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:53.322 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1382 -- # local bdev_name=f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:53.322 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1383 -- # local bdev_info 00:19:53.322 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1384 -- # local bs 00:19:53.322 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1385 -- # local nb 00:19:53.322 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b f7c57030-e14e-43e0-b88b-217dbd3ad4a3 00:19:53.582 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:19:53.582 { 00:19:53.582 "name": "f7c57030-e14e-43e0-b88b-217dbd3ad4a3", 00:19:53.582 "aliases": [ 00:19:53.582 "lvs/nvme0n1p0" 00:19:53.582 ], 00:19:53.582 "product_name": "Logical Volume", 00:19:53.582 "block_size": 4096, 00:19:53.582 "num_blocks": 26476544, 00:19:53.582 "uuid": "f7c57030-e14e-43e0-b88b-217dbd3ad4a3", 00:19:53.582 "assigned_rate_limits": { 00:19:53.582 "rw_ios_per_sec": 0, 00:19:53.582 "rw_mbytes_per_sec": 0, 00:19:53.582 "r_mbytes_per_sec": 0, 00:19:53.582 "w_mbytes_per_sec": 0 00:19:53.582 }, 00:19:53.582 "claimed": false, 00:19:53.582 "zoned": false, 00:19:53.582 "supported_io_types": { 00:19:53.582 "read": true, 00:19:53.582 "write": true, 00:19:53.582 "unmap": true, 00:19:53.582 "flush": false, 00:19:53.582 "reset": true, 00:19:53.582 "nvme_admin": false, 00:19:53.582 "nvme_io": false, 00:19:53.582 "nvme_io_md": false, 00:19:53.582 "write_zeroes": true, 00:19:53.582 "zcopy": false, 00:19:53.582 "get_zone_info": false, 00:19:53.582 "zone_management": false, 00:19:53.582 "zone_append": false, 00:19:53.582 "compare": false, 00:19:53.582 "compare_and_write": false, 00:19:53.582 "abort": false, 00:19:53.582 "seek_hole": true, 00:19:53.582 "seek_data": true, 00:19:53.582 "copy": false, 00:19:53.582 "nvme_iov_md": false 00:19:53.582 }, 00:19:53.582 "driver_specific": { 00:19:53.582 "lvol": { 00:19:53.582 "lvol_store_uuid": "e7561374-28d2-4ab0-b451-50ad9d9ca932", 00:19:53.582 "base_bdev": "nvme0n1", 00:19:53.582 "thin_provision": true, 00:19:53.582 "num_allocated_clusters": 0, 00:19:53.582 "snapshot": false, 00:19:53.582 "clone": false, 00:19:53.582 "esnap_clone": false 00:19:53.582 } 00:19:53.582 } 00:19:53.582 } 00:19:53.582 ]' 00:19:53.582 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:19:53.582 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1387 -- # bs=4096 00:19:53.582 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:19:53.582 05:10:13 ftl.ftl_restore -- 
common/autotest_common.sh@1388 -- # nb=26476544 00:19:53.582 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:19:53.582 05:10:13 ftl.ftl_restore -- common/autotest_common.sh@1392 -- # echo 103424 00:19:53.582 05:10:13 ftl.ftl_restore -- ftl/restore.sh@48 -- # l2p_dram_size_mb=10 00:19:53.582 05:10:13 ftl.ftl_restore -- ftl/restore.sh@49 -- # ftl_construct_args='bdev_ftl_create -b ftl0 -d f7c57030-e14e-43e0-b88b-217dbd3ad4a3 --l2p_dram_limit 10' 00:19:53.582 05:10:13 ftl.ftl_restore -- ftl/restore.sh@51 -- # '[' -n '' ']' 00:19:53.582 05:10:13 ftl.ftl_restore -- ftl/restore.sh@52 -- # '[' -n 0000:00:10.0 ']' 00:19:53.582 05:10:13 ftl.ftl_restore -- ftl/restore.sh@52 -- # ftl_construct_args+=' -c nvc0n1p0' 00:19:53.582 05:10:13 ftl.ftl_restore -- ftl/restore.sh@54 -- # '[' '' -eq 1 ']' 00:19:53.582 /home/vagrant/spdk_repo/spdk/test/ftl/restore.sh: line 54: [: : integer expression expected 00:19:53.583 05:10:13 ftl.ftl_restore -- ftl/restore.sh@58 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -t 240 bdev_ftl_create -b ftl0 -d f7c57030-e14e-43e0-b88b-217dbd3ad4a3 --l2p_dram_limit 10 -c nvc0n1p0 00:19:53.844 [2024-12-15 05:10:13.790674] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.844 [2024-12-15 05:10:13.790917] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:19:53.844 [2024-12-15 05:10:13.790944] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:53.844 [2024-12-15 05:10:13.790956] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.844 [2024-12-15 05:10:13.791041] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.844 [2024-12-15 05:10:13.791055] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:53.844 [2024-12-15 05:10:13.791068] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.058 ms 00:19:53.844 [2024-12-15 05:10:13.791085] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.844 [2024-12-15 05:10:13.791113] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:19:53.844 [2024-12-15 05:10:13.791584] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:19:53.844 [2024-12-15 05:10:13.791645] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.844 [2024-12-15 05:10:13.791673] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:53.844 [2024-12-15 05:10:13.791697] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.543 ms 00:19:53.844 [2024-12-15 05:10:13.791726] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.844 [2024-12-15 05:10:13.791826] mngt/ftl_mngt_md.c: 570:ftl_mngt_superblock_init: *NOTICE*: [FTL][ftl0] Create new FTL, UUID 30f623fd-11b6-47a5-ae87-857377a3ed0b 00:19:53.844 [2024-12-15 05:10:13.793722] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.844 [2024-12-15 05:10:13.793884] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Default-initialize superblock 00:19:53.844 [2024-12-15 05:10:13.793906] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.032 ms 00:19:53.844 [2024-12-15 05:10:13.793920] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.844 [2024-12-15 05:10:13.803067] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.844 [2024-12-15 
05:10:13.803225] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:53.844 [2024-12-15 05:10:13.803293] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 9.091 ms 00:19:53.844 [2024-12-15 05:10:13.803322] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.844 [2024-12-15 05:10:13.803430] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.844 [2024-12-15 05:10:13.803478] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:53.844 [2024-12-15 05:10:13.803508] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.058 ms 00:19:53.844 [2024-12-15 05:10:13.803527] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.844 [2024-12-15 05:10:13.803609] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.844 [2024-12-15 05:10:13.803789] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:19:53.844 [2024-12-15 05:10:13.803819] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:19:53.844 [2024-12-15 05:10:13.803840] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.844 [2024-12-15 05:10:13.803885] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:19:53.844 [2024-12-15 05:10:13.806175] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.844 [2024-12-15 05:10:13.806353] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:53.844 [2024-12-15 05:10:13.806371] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.301 ms 00:19:53.844 [2024-12-15 05:10:13.806383] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.844 [2024-12-15 05:10:13.806428] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.844 [2024-12-15 05:10:13.806466] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:19:53.844 [2024-12-15 05:10:13.806481] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.015 ms 00:19:53.844 [2024-12-15 05:10:13.806493] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.844 [2024-12-15 05:10:13.806514] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 1 00:19:53.844 [2024-12-15 05:10:13.806684] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:19:53.844 [2024-12-15 05:10:13.806700] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:19:53.844 [2024-12-15 05:10:13.806716] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:19:53.844 [2024-12-15 05:10:13.806727] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:19:53.844 [2024-12-15 05:10:13.806742] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:19:53.844 [2024-12-15 05:10:13.806750] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:19:53.844 [2024-12-15 05:10:13.806763] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:19:53.844 [2024-12-15 05:10:13.806771] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:19:53.844 [2024-12-15 05:10:13.806780] 
ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:19:53.845 [2024-12-15 05:10:13.806788] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.845 [2024-12-15 05:10:13.806799] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:19:53.845 [2024-12-15 05:10:13.806808] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.276 ms 00:19:53.845 [2024-12-15 05:10:13.806818] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.845 [2024-12-15 05:10:13.806904] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.845 [2024-12-15 05:10:13.806920] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:19:53.845 [2024-12-15 05:10:13.806933] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.070 ms 00:19:53.845 [2024-12-15 05:10:13.806947] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.845 [2024-12-15 05:10:13.807048] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:19:53.845 [2024-12-15 05:10:13.807064] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:19:53.845 [2024-12-15 05:10:13.807077] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:53.845 [2024-12-15 05:10:13.807089] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807102] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:19:53.845 [2024-12-15 05:10:13.807111] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807119] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:19:53.845 [2024-12-15 05:10:13.807129] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:19:53.845 [2024-12-15 05:10:13.807139] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807148] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:53.845 [2024-12-15 05:10:13.807156] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:19:53.845 [2024-12-15 05:10:13.807165] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:19:53.845 [2024-12-15 05:10:13.807175] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:19:53.845 [2024-12-15 05:10:13.807190] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:19:53.845 [2024-12-15 05:10:13.807198] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:19:53.845 [2024-12-15 05:10:13.807208] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807216] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:19:53.845 [2024-12-15 05:10:13.807229] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:19:53.845 [2024-12-15 05:10:13.807238] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807248] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:19:53.845 [2024-12-15 05:10:13.807257] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807270] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:53.845 [2024-12-15 05:10:13.807279] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:19:53.845 
[2024-12-15 05:10:13.807290] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807298] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:53.845 [2024-12-15 05:10:13.807310] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:19:53.845 [2024-12-15 05:10:13.807318] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807328] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:53.845 [2024-12-15 05:10:13.807336] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:19:53.845 [2024-12-15 05:10:13.807348] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807355] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:19:53.845 [2024-12-15 05:10:13.807365] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:19:53.845 [2024-12-15 05:10:13.807373] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807382] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:53.845 [2024-12-15 05:10:13.807389] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:19:53.845 [2024-12-15 05:10:13.807397] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:19:53.845 [2024-12-15 05:10:13.807404] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:19:53.845 [2024-12-15 05:10:13.807414] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:19:53.845 [2024-12-15 05:10:13.807421] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:19:53.845 [2024-12-15 05:10:13.807429] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807452] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:19:53.845 [2024-12-15 05:10:13.807462] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:19:53.845 [2024-12-15 05:10:13.807468] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807479] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:19:53.845 [2024-12-15 05:10:13.807488] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:19:53.845 [2024-12-15 05:10:13.807499] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:19:53.845 [2024-12-15 05:10:13.807508] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:19:53.845 [2024-12-15 05:10:13.807519] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:19:53.845 [2024-12-15 05:10:13.807527] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:19:53.845 [2024-12-15 05:10:13.807537] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:19:53.845 [2024-12-15 05:10:13.807545] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:19:53.845 [2024-12-15 05:10:13.807554] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:19:53.845 [2024-12-15 05:10:13.807561] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:19:53.845 [2024-12-15 05:10:13.807575] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:19:53.845 [2024-12-15 
05:10:13.807588] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:53.845 [2024-12-15 05:10:13.807603] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:19:53.845 [2024-12-15 05:10:13.807611] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:19:53.845 [2024-12-15 05:10:13.807621] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:19:53.845 [2024-12-15 05:10:13.807628] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:19:53.845 [2024-12-15 05:10:13.807638] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:19:53.845 [2024-12-15 05:10:13.807645] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:19:53.845 [2024-12-15 05:10:13.807657] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:19:53.845 [2024-12-15 05:10:13.807665] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:19:53.845 [2024-12-15 05:10:13.807674] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:19:53.845 [2024-12-15 05:10:13.807681] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:19:53.845 [2024-12-15 05:10:13.807691] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:19:53.845 [2024-12-15 05:10:13.807698] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:19:53.845 [2024-12-15 05:10:13.807707] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:19:53.845 [2024-12-15 05:10:13.807715] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:19:53.845 [2024-12-15 05:10:13.807724] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:19:53.845 [2024-12-15 05:10:13.807737] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:19:53.845 [2024-12-15 05:10:13.807747] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:19:53.845 [2024-12-15 05:10:13.807754] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:19:53.845 [2024-12-15 05:10:13.807765] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:19:53.845 [2024-12-15 05:10:13.807773] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: 
[FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:19:53.845 [2024-12-15 05:10:13.807782] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:53.845 [2024-12-15 05:10:13.807791] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:19:53.845 [2024-12-15 05:10:13.807803] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.798 ms 00:19:53.845 [2024-12-15 05:10:13.807810] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:53.845 [2024-12-15 05:10:13.807877] mngt/ftl_mngt_misc.c: 165:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] NV cache data region needs scrubbing, this may take a while. 00:19:53.845 [2024-12-15 05:10:13.807889] mngt/ftl_mngt_misc.c: 166:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] Scrubbing 5 chunks 00:19:58.052 [2024-12-15 05:10:17.788412] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.788520] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Scrub NV cache 00:19:58.052 [2024-12-15 05:10:17.788541] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3980.507 ms 00:19:58.052 [2024-12-15 05:10:17.788559] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.803567] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.803797] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:58.052 [2024-12-15 05:10:17.803960] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 14.871 ms 00:19:58.052 [2024-12-15 05:10:17.803996] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.804159] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.804297] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:19:58.052 [2024-12-15 05:10:17.804327] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.078 ms 00:19:58.052 [2024-12-15 05:10:17.804347] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.816858] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.817050] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:58.052 [2024-12-15 05:10:17.817156] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 12.369 ms 00:19:58.052 [2024-12-15 05:10:17.817172] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.817212] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.817220] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:58.052 [2024-12-15 05:10:17.817232] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:58.052 [2024-12-15 05:10:17.817241] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.817859] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.817902] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:58.052 [2024-12-15 05:10:17.817917] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.533 ms 00:19:58.052 [2024-12-15 05:10:17.817927] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 
[2024-12-15 05:10:17.818054] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.818065] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:58.052 [2024-12-15 05:10:17.818087] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.093 ms 00:19:58.052 [2024-12-15 05:10:17.818097] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.826469] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.826516] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:58.052 [2024-12-15 05:10:17.826530] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.344 ms 00:19:58.052 [2024-12-15 05:10:17.826539] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.849157] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:19:58.052 [2024-12-15 05:10:17.853175] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.853233] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:19:58.052 [2024-12-15 05:10:17.853247] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 26.560 ms 00:19:58.052 [2024-12-15 05:10:17.853258] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.941395] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.941498] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear L2P 00:19:58.052 [2024-12-15 05:10:17.941517] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 88.088 ms 00:19:58.052 [2024-12-15 05:10:17.941531] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.941751] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.941768] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:19:58.052 [2024-12-15 05:10:17.941778] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.162 ms 00:19:58.052 [2024-12-15 05:10:17.941789] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.948580] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.948641] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial band info metadata 00:19:58.052 [2024-12-15 05:10:17.948657] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.751 ms 00:19:58.052 [2024-12-15 05:10:17.948668] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.954697] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.954938] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial chunk info metadata 00:19:58.052 [2024-12-15 05:10:17.954960] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.974 ms 00:19:58.052 [2024-12-15 05:10:17.954971] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:17.955313] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:17.955330] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:19:58.052 
[2024-12-15 05:10:17.955342] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.297 ms 00:19:58.052 [2024-12-15 05:10:17.955354] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:18.004741] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:18.004807] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Wipe P2L region 00:19:58.052 [2024-12-15 05:10:18.004822] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 49.340 ms 00:19:58.052 [2024-12-15 05:10:18.004834] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:18.012197] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:18.012255] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim map 00:19:58.052 [2024-12-15 05:10:18.012267] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.285 ms 00:19:58.052 [2024-12-15 05:10:18.012279] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:18.018583] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:18.018636] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim log 00:19:58.052 [2024-12-15 05:10:18.018646] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.254 ms 00:19:58.052 [2024-12-15 05:10:18.018656] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:18.025366] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:18.025424] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:19:58.052 [2024-12-15 05:10:18.025492] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.661 ms 00:19:58.052 [2024-12-15 05:10:18.025506] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:18.025560] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:18.025583] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:19:58.052 [2024-12-15 05:10:18.025597] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.007 ms 00:19:58.052 [2024-12-15 05:10:18.025608] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:18.025710] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.052 [2024-12-15 05:10:18.025727] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:19:58.052 [2024-12-15 05:10:18.025736] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.036 ms 00:19:58.052 [2024-12-15 05:10:18.025749] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.052 [2024-12-15 05:10:18.026999] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 4235.879 ms, result 0 00:19:58.052 { 00:19:58.052 "name": "ftl0", 00:19:58.052 "uuid": "30f623fd-11b6-47a5-ae87-857377a3ed0b" 00:19:58.052 } 00:19:58.052 05:10:18 ftl.ftl_restore -- ftl/restore.sh@61 -- # echo '{"subsystems": [' 00:19:58.052 05:10:18 ftl.ftl_restore -- ftl/restore.sh@62 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py save_subsystem_config -n bdev 00:19:58.314 05:10:18 ftl.ftl_restore -- ftl/restore.sh@63 -- # echo ']}' 00:19:58.314 05:10:18 ftl.ftl_restore -- 
ftl/restore.sh@65 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_unload -b ftl0 00:19:58.580 [2024-12-15 05:10:18.462250] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.462316] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:19:58.580 [2024-12-15 05:10:18.462336] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:19:58.580 [2024-12-15 05:10:18.462345] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.462373] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:19:58.580 [2024-12-15 05:10:18.463147] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.463198] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:19:58.580 [2024-12-15 05:10:18.463214] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.755 ms 00:19:58.580 [2024-12-15 05:10:18.463226] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.463683] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.463773] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:19:58.580 [2024-12-15 05:10:18.463836] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.426 ms 00:19:58.580 [2024-12-15 05:10:18.463864] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.467166] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.467198] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:19:58.580 [2024-12-15 05:10:18.467209] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.243 ms 00:19:58.580 [2024-12-15 05:10:18.467220] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.473412] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.473613] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:19:58.580 [2024-12-15 05:10:18.473636] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.171 ms 00:19:58.580 [2024-12-15 05:10:18.473656] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.476966] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.477169] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:19:58.580 [2024-12-15 05:10:18.477189] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.218 ms 00:19:58.580 [2024-12-15 05:10:18.477203] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.484270] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.484506] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:19:58.580 [2024-12-15 05:10:18.484528] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.021 ms 00:19:58.580 [2024-12-15 05:10:18.484540] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.484680] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.484695] 
mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:19:58.580 [2024-12-15 05:10:18.484709] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.090 ms 00:19:58.580 [2024-12-15 05:10:18.484721] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.488089] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.488304] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:19:58.580 [2024-12-15 05:10:18.488323] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.347 ms 00:19:58.580 [2024-12-15 05:10:18.488334] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.491408] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.491622] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:19:58.580 [2024-12-15 05:10:18.491641] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.028 ms 00:19:58.580 [2024-12-15 05:10:18.491651] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.494173] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.494237] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:19:58.580 [2024-12-15 05:10:18.494247] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.477 ms 00:19:58.580 [2024-12-15 05:10:18.494258] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.496612] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.580 [2024-12-15 05:10:18.496673] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:19:58.580 [2024-12-15 05:10:18.496684] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.277 ms 00:19:58.580 [2024-12-15 05:10:18.496699] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.580 [2024-12-15 05:10:18.496748] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:19:58.580 [2024-12-15 05:10:18.496768] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496786] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496797] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496807] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496821] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496831] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496842] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496850] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496860] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496868] 
ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496877] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496885] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496894] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496903] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496913] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496920] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496929] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496938] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496949] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496957] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496969] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496977] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496988] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.496995] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.497004] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.497012] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:19:58.580 [2024-12-15 05:10:18.497025] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497032] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497043] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497051] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497061] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497071] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497081] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497089] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 
[2024-12-15 05:10:18.497099] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497107] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497121] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497128] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497138] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497145] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497155] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497163] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497174] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497181] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497191] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497198] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497208] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497216] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497226] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497233] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497245] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497254] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497266] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497274] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497283] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497291] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497301] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497309] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497326] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 
state: free 00:19:58.581 [2024-12-15 05:10:18.497334] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497345] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497353] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497364] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497372] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497382] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497391] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497402] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497410] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497422] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497429] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497459] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497467] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497477] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497485] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497496] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497504] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497516] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497523] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497533] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497540] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497553] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497562] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497572] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497580] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 
0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497593] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497610] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497620] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497627] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497637] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497645] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497655] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497663] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497672] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497680] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497693] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497703] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497713] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497721] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497732] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497741] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:19:58.581 [2024-12-15 05:10:18.497762] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:19:58.581 [2024-12-15 05:10:18.497772] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 30f623fd-11b6-47a5-ae87-857377a3ed0b 00:19:58.581 [2024-12-15 05:10:18.497784] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:19:58.581 [2024-12-15 05:10:18.497792] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:19:58.581 [2024-12-15 05:10:18.497802] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:19:58.581 [2024-12-15 05:10:18.497810] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:19:58.581 [2024-12-15 05:10:18.497820] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:19:58.581 [2024-12-15 05:10:18.497836] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:19:58.581 [2024-12-15 05:10:18.497846] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:19:58.581 [2024-12-15 05:10:18.497852] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:19:58.581 [2024-12-15 05:10:18.497862] ftl_debug.c: 220:ftl_dev_dump_stats: 
*NOTICE*: [FTL][ftl0] start: 0 00:19:58.581 [2024-12-15 05:10:18.497869] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.581 [2024-12-15 05:10:18.497879] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:19:58.581 [2024-12-15 05:10:18.497888] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.122 ms 00:19:58.581 [2024-12-15 05:10:18.497898] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.581 [2024-12-15 05:10:18.500388] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.581 [2024-12-15 05:10:18.500452] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:19:58.581 [2024-12-15 05:10:18.500464] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.468 ms 00:19:58.581 [2024-12-15 05:10:18.500479] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.581 [2024-12-15 05:10:18.500625] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:19:58.581 [2024-12-15 05:10:18.500638] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:19:58.581 [2024-12-15 05:10:18.500649] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.096 ms 00:19:58.581 [2024-12-15 05:10:18.500663] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.581 [2024-12-15 05:10:18.509224] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.509284] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:19:58.582 [2024-12-15 05:10:18.509299] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.509310] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.509380] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.509391] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:19:58.582 [2024-12-15 05:10:18.509399] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.509410] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.509526] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.509552] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:19:58.582 [2024-12-15 05:10:18.509561] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.509576] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.509595] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.509606] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:19:58.582 [2024-12-15 05:10:18.509614] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.509624] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.524019] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.524079] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:19:58.582 [2024-12-15 05:10:18.524106] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 
[2024-12-15 05:10:18.524118] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.535087] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.535324] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:19:58.582 [2024-12-15 05:10:18.535343] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.535353] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.535472] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.535494] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:19:58.582 [2024-12-15 05:10:18.535503] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.535514] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.535567] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.535580] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:19:58.582 [2024-12-15 05:10:18.535589] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.535598] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.535677] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.535691] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:19:58.582 [2024-12-15 05:10:18.535701] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.535711] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.535751] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.535765] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:19:58.582 [2024-12-15 05:10:18.535774] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.535785] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.535823] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.535838] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:19:58.582 [2024-12-15 05:10:18.535848] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.535860] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.535909] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:19:58.582 [2024-12-15 05:10:18.535924] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:19:58.582 [2024-12-15 05:10:18.535933] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:19:58.582 [2024-12-15 05:10:18.535944] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:19:58.582 [2024-12-15 05:10:18.536084] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 73.795 ms, result 0 00:19:58.582 true 00:19:58.582 05:10:18 ftl.ftl_restore -- ftl/restore.sh@66 -- # killprocess 90280 00:19:58.582 
05:10:18 ftl.ftl_restore -- common/autotest_common.sh@954 -- # '[' -z 90280 ']' 00:19:58.582 05:10:18 ftl.ftl_restore -- common/autotest_common.sh@958 -- # kill -0 90280 00:19:58.582 05:10:18 ftl.ftl_restore -- common/autotest_common.sh@959 -- # uname 00:19:58.582 05:10:18 ftl.ftl_restore -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:19:58.582 05:10:18 ftl.ftl_restore -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 90280 00:19:58.582 killing process with pid 90280 00:19:58.582 05:10:18 ftl.ftl_restore -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:19:58.582 05:10:18 ftl.ftl_restore -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:19:58.582 05:10:18 ftl.ftl_restore -- common/autotest_common.sh@972 -- # echo 'killing process with pid 90280' 00:19:58.582 05:10:18 ftl.ftl_restore -- common/autotest_common.sh@973 -- # kill 90280 00:19:58.582 05:10:18 ftl.ftl_restore -- common/autotest_common.sh@978 -- # wait 90280 00:20:03.963 05:10:23 ftl.ftl_restore -- ftl/restore.sh@69 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/ftl/testfile bs=4K count=256K 00:20:08.172 262144+0 records in 00:20:08.172 262144+0 records out 00:20:08.172 1073741824 bytes (1.1 GB, 1.0 GiB) copied, 4.23632 s, 253 MB/s 00:20:08.172 05:10:27 ftl.ftl_restore -- ftl/restore.sh@70 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/testfile 00:20:09.558 05:10:29 ftl.ftl_restore -- ftl/restore.sh@73 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --ob=ftl0 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:20:09.558 [2024-12-15 05:10:29.501615] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:20:09.558 [2024-12-15 05:10:29.501712] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid90499 ] 00:20:09.558 [2024-12-15 05:10:29.654365] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:09.558 [2024-12-15 05:10:29.675667] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:20:09.821 [2024-12-15 05:10:29.773803] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:20:09.821 [2024-12-15 05:10:29.774079] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:20:09.821 [2024-12-15 05:10:29.935132] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.821 [2024-12-15 05:10:29.935200] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:20:09.821 [2024-12-15 05:10:29.935216] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:20:09.821 [2024-12-15 05:10:29.935225] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.821 [2024-12-15 05:10:29.935283] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.821 [2024-12-15 05:10:29.935295] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:20:09.821 [2024-12-15 05:10:29.935304] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.039 ms 00:20:09.821 [2024-12-15 05:10:29.935312] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.821 [2024-12-15 05:10:29.935336] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 
as write buffer cache 00:20:09.821 [2024-12-15 05:10:29.935667] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:20:09.821 [2024-12-15 05:10:29.935688] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.821 [2024-12-15 05:10:29.935700] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:20:09.821 [2024-12-15 05:10:29.935716] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.357 ms 00:20:09.821 [2024-12-15 05:10:29.935725] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.821 [2024-12-15 05:10:29.937548] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:20:09.821 [2024-12-15 05:10:29.941982] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.821 [2024-12-15 05:10:29.942037] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:20:09.821 [2024-12-15 05:10:29.942058] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.437 ms 00:20:09.821 [2024-12-15 05:10:29.942076] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.821 [2024-12-15 05:10:29.942159] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.821 [2024-12-15 05:10:29.942177] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:20:09.821 [2024-12-15 05:10:29.942187] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.031 ms 00:20:09.821 [2024-12-15 05:10:29.942194] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.821 [2024-12-15 05:10:29.950965] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.821 [2024-12-15 05:10:29.951215] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:20:09.821 [2024-12-15 05:10:29.951245] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.718 ms 00:20:09.821 [2024-12-15 05:10:29.951254] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.821 [2024-12-15 05:10:29.951366] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.822 [2024-12-15 05:10:29.951378] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:20:09.822 [2024-12-15 05:10:29.951388] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.076 ms 00:20:09.822 [2024-12-15 05:10:29.951398] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.822 [2024-12-15 05:10:29.951521] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.822 [2024-12-15 05:10:29.951534] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:20:09.822 [2024-12-15 05:10:29.951545] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.012 ms 00:20:09.822 [2024-12-15 05:10:29.951558] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.822 [2024-12-15 05:10:29.951582] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:20:09.822 [2024-12-15 05:10:29.953858] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.822 [2024-12-15 05:10:29.953900] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:20:09.822 [2024-12-15 05:10:29.953911] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.281 ms 00:20:09.822 [2024-12-15 05:10:29.953919] mngt/ftl_mngt.c: 
431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.822 [2024-12-15 05:10:29.953960] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.822 [2024-12-15 05:10:29.953971] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:20:09.822 [2024-12-15 05:10:29.953981] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:20:09.822 [2024-12-15 05:10:29.953994] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.822 [2024-12-15 05:10:29.954021] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:20:09.822 [2024-12-15 05:10:29.954045] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:20:09.822 [2024-12-15 05:10:29.954084] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:20:09.822 [2024-12-15 05:10:29.954103] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:20:09.822 [2024-12-15 05:10:29.954210] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:20:09.822 [2024-12-15 05:10:29.954222] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:20:09.822 [2024-12-15 05:10:29.954237] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:20:09.822 [2024-12-15 05:10:29.954250] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:20:09.822 [2024-12-15 05:10:29.954260] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:20:09.822 [2024-12-15 05:10:29.954269] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:20:09.822 [2024-12-15 05:10:29.954278] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:20:09.822 [2024-12-15 05:10:29.954286] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:20:09.822 [2024-12-15 05:10:29.954299] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:20:09.822 [2024-12-15 05:10:29.954311] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.822 [2024-12-15 05:10:29.954325] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:20:09.822 [2024-12-15 05:10:29.954335] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.296 ms 00:20:09.822 [2024-12-15 05:10:29.954345] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.822 [2024-12-15 05:10:29.954431] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.822 [2024-12-15 05:10:29.954690] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:20:09.822 [2024-12-15 05:10:29.954716] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:20:09.822 [2024-12-15 05:10:29.954737] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:09.822 [2024-12-15 05:10:29.954861] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:20:09.822 [2024-12-15 05:10:29.954887] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:20:09.822 [2024-12-15 05:10:29.954908] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:20:09.822 
[2024-12-15 05:10:29.954943] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955027] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:20:09.822 [2024-12-15 05:10:29.955052] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955073] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:20:09.822 [2024-12-15 05:10:29.955091] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:20:09.822 [2024-12-15 05:10:29.955113] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955132] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:20:09.822 [2024-12-15 05:10:29.955153] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:20:09.822 [2024-12-15 05:10:29.955178] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:20:09.822 [2024-12-15 05:10:29.955234] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:20:09.822 [2024-12-15 05:10:29.955259] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:20:09.822 [2024-12-15 05:10:29.955278] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:20:09.822 [2024-12-15 05:10:29.955298] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955317] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:20:09.822 [2024-12-15 05:10:29.955337] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:20:09.822 [2024-12-15 05:10:29.955356] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955377] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:20:09.822 [2024-12-15 05:10:29.955395] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955414] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:20:09.822 [2024-12-15 05:10:29.955450] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:20:09.822 [2024-12-15 05:10:29.955471] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955548] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:20:09.822 [2024-12-15 05:10:29.955559] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:20:09.822 [2024-12-15 05:10:29.955567] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955586] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:20:09.822 [2024-12-15 05:10:29.955596] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:20:09.822 [2024-12-15 05:10:29.955604] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955611] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:20:09.822 [2024-12-15 05:10:29.955619] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:20:09.822 [2024-12-15 05:10:29.955627] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955634] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:20:09.822 [2024-12-15 05:10:29.955641] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region 
trim_md_mirror 00:20:09.822 [2024-12-15 05:10:29.955650] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:20:09.822 [2024-12-15 05:10:29.955659] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:20:09.822 [2024-12-15 05:10:29.955668] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:20:09.822 [2024-12-15 05:10:29.955676] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:20:09.822 [2024-12-15 05:10:29.955683] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955690] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:20:09.822 [2024-12-15 05:10:29.955698] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:20:09.822 [2024-12-15 05:10:29.955705] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955715] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:20:09.822 [2024-12-15 05:10:29.955731] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:20:09.822 [2024-12-15 05:10:29.955739] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:20:09.822 [2024-12-15 05:10:29.955747] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:09.822 [2024-12-15 05:10:29.955756] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:20:09.822 [2024-12-15 05:10:29.955762] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:20:09.822 [2024-12-15 05:10:29.955769] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:20:09.822 [2024-12-15 05:10:29.955777] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:20:09.822 [2024-12-15 05:10:29.955783] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:20:09.822 [2024-12-15 05:10:29.955791] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:20:09.822 [2024-12-15 05:10:29.955801] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:20:09.822 [2024-12-15 05:10:29.955812] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:20:09.822 [2024-12-15 05:10:29.955821] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:20:09.822 [2024-12-15 05:10:29.955828] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:20:09.822 [2024-12-15 05:10:29.955835] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:20:09.822 [2024-12-15 05:10:29.955844] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:20:09.822 [2024-12-15 05:10:29.955855] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:20:09.822 [2024-12-15 05:10:29.955862] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:20:09.822 [2024-12-15 05:10:29.955869] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd 
ver:2 blk_offs:0x6920 blk_sz:0x800 00:20:09.822 [2024-12-15 05:10:29.955877] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:20:09.822 [2024-12-15 05:10:29.955885] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:20:09.822 [2024-12-15 05:10:29.955892] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:20:09.822 [2024-12-15 05:10:29.955899] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:20:09.823 [2024-12-15 05:10:29.955906] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:20:09.823 [2024-12-15 05:10:29.955913] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:20:09.823 [2024-12-15 05:10:29.955920] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:20:09.823 [2024-12-15 05:10:29.955931] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:20:09.823 [2024-12-15 05:10:29.955941] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:20:09.823 [2024-12-15 05:10:29.955953] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:20:09.823 [2024-12-15 05:10:29.955961] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:20:09.823 [2024-12-15 05:10:29.955968] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:20:09.823 [2024-12-15 05:10:29.955978] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:20:09.823 [2024-12-15 05:10:29.955991] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:09.823 [2024-12-15 05:10:29.955999] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:20:09.823 [2024-12-15 05:10:29.956011] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.197 ms 00:20:09.823 [2024-12-15 05:10:29.956021] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:29.971529] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:29.971588] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:20:10.085 [2024-12-15 05:10:29.971602] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 15.429 ms 00:20:10.085 [2024-12-15 05:10:29.971611] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:29.971705] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:29.971721] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:20:10.085 [2024-12-15 05:10:29.971730] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.067 ms 
00:20:10.085 [2024-12-15 05:10:29.971738] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:29.994986] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:29.995046] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:20:10.085 [2024-12-15 05:10:29.995061] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 23.185 ms 00:20:10.085 [2024-12-15 05:10:29.995071] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:29.995133] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:29.995148] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:20:10.085 [2024-12-15 05:10:29.995159] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:20:10.085 [2024-12-15 05:10:29.995168] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:29.995774] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:29.995809] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:20:10.085 [2024-12-15 05:10:29.995831] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.533 ms 00:20:10.085 [2024-12-15 05:10:29.995845] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:29.996015] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:29.996028] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:20:10.085 [2024-12-15 05:10:29.996039] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.136 ms 00:20:10.085 [2024-12-15 05:10:29.996049] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:30.004303] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:30.004353] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:20:10.085 [2024-12-15 05:10:30.004365] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.229 ms 00:20:10.085 [2024-12-15 05:10:30.004375] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:30.008515] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 0, empty chunks = 4 00:20:10.085 [2024-12-15 05:10:30.008564] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:20:10.085 [2024-12-15 05:10:30.008576] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:30.008585] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:20:10.085 [2024-12-15 05:10:30.008595] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.064 ms 00:20:10.085 [2024-12-15 05:10:30.008602] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:30.024772] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:30.024820] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:20:10.085 [2024-12-15 05:10:30.024833] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 16.107 ms 00:20:10.085 [2024-12-15 05:10:30.024842] mngt/ftl_mngt.c: 431:trace_step: 
*NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:30.027710] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:30.027759] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:20:10.085 [2024-12-15 05:10:30.027769] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.811 ms 00:20:10.085 [2024-12-15 05:10:30.027777] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:30.030543] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:30.030743] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:20:10.085 [2024-12-15 05:10:30.030762] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.718 ms 00:20:10.085 [2024-12-15 05:10:30.030771] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:30.031114] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:30.031129] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:20:10.085 [2024-12-15 05:10:30.031140] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.271 ms 00:20:10.085 [2024-12-15 05:10:30.031148] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:30.056737] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:30.056799] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:20:10.085 [2024-12-15 05:10:30.056814] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 25.565 ms 00:20:10.085 [2024-12-15 05:10:30.056823] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:30.064999] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:20:10.085 [2024-12-15 05:10:30.068275] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:30.068327] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:20:10.085 [2024-12-15 05:10:30.068339] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 11.392 ms 00:20:10.085 [2024-12-15 05:10:30.068352] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:30.068460] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:30.068473] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:20:10.085 [2024-12-15 05:10:30.068482] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.038 ms 00:20:10.085 [2024-12-15 05:10:30.068491] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.085 [2024-12-15 05:10:30.068560] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.085 [2024-12-15 05:10:30.068571] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:20:10.086 [2024-12-15 05:10:30.068590] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.033 ms 00:20:10.086 [2024-12-15 05:10:30.068599] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.086 [2024-12-15 05:10:30.068622] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.086 [2024-12-15 05:10:30.068636] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start 
core poller 00:20:10.086 [2024-12-15 05:10:30.068645] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:20:10.086 [2024-12-15 05:10:30.068653] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.086 [2024-12-15 05:10:30.068693] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:20:10.086 [2024-12-15 05:10:30.068703] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.086 [2024-12-15 05:10:30.068712] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:20:10.086 [2024-12-15 05:10:30.068720] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:20:10.086 [2024-12-15 05:10:30.068732] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.086 [2024-12-15 05:10:30.074181] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.086 [2024-12-15 05:10:30.074229] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:20:10.086 [2024-12-15 05:10:30.074241] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.422 ms 00:20:10.086 [2024-12-15 05:10:30.074249] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.086 [2024-12-15 05:10:30.074346] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:10.086 [2024-12-15 05:10:30.074357] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:20:10.086 [2024-12-15 05:10:30.074366] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.046 ms 00:20:10.086 [2024-12-15 05:10:30.074378] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:10.086 [2024-12-15 05:10:30.075553] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 139.944 ms, result 0 00:20:11.031  [2024-12-15T05:10:32.115Z] Copying: 10/1024 [MB] (10 MBps) [2024-12-15T05:10:33.503Z] Copying: 24/1024 [MB] (14 MBps) [2024-12-15T05:10:34.447Z] Copying: 60/1024 [MB] (36 MBps) [2024-12-15T05:10:35.389Z] Copying: 74/1024 [MB] (13 MBps) [2024-12-15T05:10:36.334Z] Copying: 104/1024 [MB] (30 MBps) [2024-12-15T05:10:37.278Z] Copying: 142/1024 [MB] (37 MBps) [2024-12-15T05:10:38.225Z] Copying: 173/1024 [MB] (30 MBps) [2024-12-15T05:10:39.168Z] Copying: 203/1024 [MB] (29 MBps) [2024-12-15T05:10:40.111Z] Copying: 240/1024 [MB] (37 MBps) [2024-12-15T05:10:41.499Z] Copying: 258/1024 [MB] (18 MBps) [2024-12-15T05:10:42.440Z] Copying: 282/1024 [MB] (23 MBps) [2024-12-15T05:10:43.384Z] Copying: 303/1024 [MB] (21 MBps) [2024-12-15T05:10:44.328Z] Copying: 324/1024 [MB] (21 MBps) [2024-12-15T05:10:45.274Z] Copying: 345/1024 [MB] (20 MBps) [2024-12-15T05:10:46.218Z] Copying: 367/1024 [MB] (22 MBps) [2024-12-15T05:10:47.162Z] Copying: 388/1024 [MB] (20 MBps) [2024-12-15T05:10:48.106Z] Copying: 414/1024 [MB] (25 MBps) [2024-12-15T05:10:49.522Z] Copying: 433/1024 [MB] (19 MBps) [2024-12-15T05:10:50.094Z] Copying: 456/1024 [MB] (22 MBps) [2024-12-15T05:10:51.481Z] Copying: 484/1024 [MB] (28 MBps) [2024-12-15T05:10:52.425Z] Copying: 503/1024 [MB] (19 MBps) [2024-12-15T05:10:53.369Z] Copying: 527/1024 [MB] (23 MBps) [2024-12-15T05:10:54.314Z] Copying: 545/1024 [MB] (17 MBps) [2024-12-15T05:10:55.257Z] Copying: 572/1024 [MB] (27 MBps) [2024-12-15T05:10:56.199Z] Copying: 602/1024 [MB] (30 MBps) [2024-12-15T05:10:57.142Z] Copying: 621/1024 [MB] (18 MBps) [2024-12-15T05:10:58.526Z] Copying: 637/1024 [MB] (15 MBps) 
[2024-12-15T05:10:59.098Z] Copying: 657/1024 [MB] (20 MBps) [2024-12-15T05:11:00.484Z] Copying: 672/1024 [MB] (14 MBps) [2024-12-15T05:11:01.426Z] Copying: 691/1024 [MB] (19 MBps) [2024-12-15T05:11:02.369Z] Copying: 710/1024 [MB] (18 MBps) [2024-12-15T05:11:03.312Z] Copying: 740/1024 [MB] (30 MBps) [2024-12-15T05:11:04.256Z] Copying: 758/1024 [MB] (17 MBps) [2024-12-15T05:11:05.200Z] Copying: 773/1024 [MB] (15 MBps) [2024-12-15T05:11:06.142Z] Copying: 789/1024 [MB] (15 MBps) [2024-12-15T05:11:07.528Z] Copying: 808/1024 [MB] (19 MBps) [2024-12-15T05:11:08.100Z] Copying: 833/1024 [MB] (24 MBps) [2024-12-15T05:11:09.490Z] Copying: 857/1024 [MB] (23 MBps) [2024-12-15T05:11:10.432Z] Copying: 867/1024 [MB] (10 MBps) [2024-12-15T05:11:11.376Z] Copying: 880/1024 [MB] (12 MBps) [2024-12-15T05:11:12.318Z] Copying: 893/1024 [MB] (13 MBps) [2024-12-15T05:11:13.262Z] Copying: 925/1024 [MB] (31 MBps) [2024-12-15T05:11:14.205Z] Copying: 941/1024 [MB] (15 MBps) [2024-12-15T05:11:15.203Z] Copying: 956/1024 [MB] (15 MBps) [2024-12-15T05:11:16.180Z] Copying: 982/1024 [MB] (25 MBps) [2024-12-15T05:11:17.123Z] Copying: 999/1024 [MB] (17 MBps) [2024-12-15T05:11:17.697Z] Copying: 1017/1024 [MB] (17 MBps) [2024-12-15T05:11:17.697Z] Copying: 1024/1024 [MB] (average 21 MBps)[2024-12-15 05:11:17.441718] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.441781] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:20:57.557 [2024-12-15 05:11:17.441797] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:20:57.557 [2024-12-15 05:11:17.441813] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.441835] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:20:57.557 [2024-12-15 05:11:17.442540] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.442567] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:20:57.557 [2024-12-15 05:11:17.442578] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.689 ms 00:20:57.557 [2024-12-15 05:11:17.442595] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.445449] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.445493] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:20:57.557 [2024-12-15 05:11:17.445504] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.810 ms 00:20:57.557 [2024-12-15 05:11:17.445512] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.463220] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.463278] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:20:57.557 [2024-12-15 05:11:17.463290] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 17.686 ms 00:20:57.557 [2024-12-15 05:11:17.463300] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.469661] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.469700] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:20:57.557 [2024-12-15 05:11:17.469710] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.322 ms 00:20:57.557 
[2024-12-15 05:11:17.469718] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.472601] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.472788] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:20:57.557 [2024-12-15 05:11:17.472807] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.826 ms 00:20:57.557 [2024-12-15 05:11:17.472815] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.478111] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.478157] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:20:57.557 [2024-12-15 05:11:17.478169] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.060 ms 00:20:57.557 [2024-12-15 05:11:17.478178] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.478304] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.478317] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:20:57.557 [2024-12-15 05:11:17.478327] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.085 ms 00:20:57.557 [2024-12-15 05:11:17.478345] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.481760] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.481800] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:20:57.557 [2024-12-15 05:11:17.481810] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.396 ms 00:20:57.557 [2024-12-15 05:11:17.481817] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.483770] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.483810] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:20:57.557 [2024-12-15 05:11:17.483820] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.913 ms 00:20:57.557 [2024-12-15 05:11:17.483828] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.485459] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.485500] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:20:57.557 [2024-12-15 05:11:17.485510] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.593 ms 00:20:57.557 [2024-12-15 05:11:17.485519] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.487424] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.557 [2024-12-15 05:11:17.487490] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:20:57.557 [2024-12-15 05:11:17.487500] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.839 ms 00:20:57.557 [2024-12-15 05:11:17.487508] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.557 [2024-12-15 05:11:17.487582] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:20:57.557 [2024-12-15 05:11:17.487608] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 
05:11:17.487618] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487628] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487636] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487645] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487654] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487662] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487670] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487678] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487685] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487693] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487701] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487708] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487716] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487724] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487731] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487739] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487747] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487756] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487764] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487771] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487780] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487788] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487796] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487803] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 [2024-12-15 05:11:17.487811] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:20:57.557 
[2024-12-15 05:11:17.487820] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487828] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487836] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487845] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487853] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487862] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487869] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487877] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487884] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487894] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487902] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487910] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487917] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487925] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487934] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487942] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487949] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487957] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487965] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487973] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487982] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487989] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.487996] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488004] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488012] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 
state: free 00:20:57.558 [2024-12-15 05:11:17.488019] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488028] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488037] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488044] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488053] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488060] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488068] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488075] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488082] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488089] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488099] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488106] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488125] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488146] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488154] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488162] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488170] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488177] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488185] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488192] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488199] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488207] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488216] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488224] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488231] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 
0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488239] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488246] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488253] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488260] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488267] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488274] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488281] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488289] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488297] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488305] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488312] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488320] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488327] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488334] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488341] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488349] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488357] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488364] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488371] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488379] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488387] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488395] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488402] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488410] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:20:57.558 [2024-12-15 05:11:17.488425] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: 
[FTL][ftl0] 00:20:57.558 [2024-12-15 05:11:17.488449] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 30f623fd-11b6-47a5-ae87-857377a3ed0b 00:20:57.558 [2024-12-15 05:11:17.488460] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:20:57.558 [2024-12-15 05:11:17.488467] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:20:57.558 [2024-12-15 05:11:17.488475] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:20:57.558 [2024-12-15 05:11:17.488484] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:20:57.558 [2024-12-15 05:11:17.488496] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:20:57.558 [2024-12-15 05:11:17.488505] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:20:57.558 [2024-12-15 05:11:17.488513] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:20:57.558 [2024-12-15 05:11:17.488520] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:20:57.558 [2024-12-15 05:11:17.488532] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:20:57.558 [2024-12-15 05:11:17.488539] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.558 [2024-12-15 05:11:17.488554] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:20:57.558 [2024-12-15 05:11:17.488564] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.958 ms 00:20:57.558 [2024-12-15 05:11:17.488571] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.558 [2024-12-15 05:11:17.490750] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.558 [2024-12-15 05:11:17.490788] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:20:57.558 [2024-12-15 05:11:17.490800] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.161 ms 00:20:57.558 [2024-12-15 05:11:17.490810] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.558 [2024-12-15 05:11:17.490923] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:57.559 [2024-12-15 05:11:17.490933] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:20:57.559 [2024-12-15 05:11:17.490943] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.091 ms 00:20:57.559 [2024-12-15 05:11:17.490956] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.498103] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.498146] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:20:57.559 [2024-12-15 05:11:17.498157] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.498165] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.498229] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.498238] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:20:57.559 [2024-12-15 05:11:17.498247] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.498254] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.498320] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 
[2024-12-15 05:11:17.498338] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:20:57.559 [2024-12-15 05:11:17.498346] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.498354] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.498370] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.498381] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:20:57.559 [2024-12-15 05:11:17.498389] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.498397] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.511792] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.511840] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:20:57.559 [2024-12-15 05:11:17.511852] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.511861] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.522758] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.522812] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:20:57.559 [2024-12-15 05:11:17.522823] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.522832] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.522884] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.522895] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:20:57.559 [2024-12-15 05:11:17.522904] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.522912] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.522960] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.522970] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:20:57.559 [2024-12-15 05:11:17.522983] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.522991] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.523064] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.523075] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:20:57.559 [2024-12-15 05:11:17.523084] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.523092] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.523122] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.523132] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:20:57.559 [2024-12-15 05:11:17.523140] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.523151] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.523193] 
mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.523204] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:20:57.559 [2024-12-15 05:11:17.523213] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.523221] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.523268] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:20:57.559 [2024-12-15 05:11:17.523280] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:20:57.559 [2024-12-15 05:11:17.523293] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:20:57.559 [2024-12-15 05:11:17.523301] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:57.559 [2024-12-15 05:11:17.523459] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 81.688 ms, result 0 00:20:57.820 00:20:57.820 00:20:57.820 05:11:17 ftl.ftl_restore -- ftl/restore.sh@74 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=ftl0 --of=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json --count=262144 00:20:57.820 [2024-12-15 05:11:17.944801] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:20:57.820 [2024-12-15 05:11:17.945144] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid90999 ] 00:20:58.082 [2024-12-15 05:11:18.097030] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:58.082 [2024-12-15 05:11:18.126025] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:20:58.345 [2024-12-15 05:11:18.241552] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:20:58.345 [2024-12-15 05:11:18.241639] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:20:58.345 [2024-12-15 05:11:18.402809] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.345 [2024-12-15 05:11:18.402868] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:20:58.345 [2024-12-15 05:11:18.402883] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:20:58.345 [2024-12-15 05:11:18.402891] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.345 [2024-12-15 05:11:18.402946] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.345 [2024-12-15 05:11:18.402961] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:20:58.345 [2024-12-15 05:11:18.402969] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.035 ms 00:20:58.345 [2024-12-15 05:11:18.402977] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.345 [2024-12-15 05:11:18.403007] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:20:58.345 [2024-12-15 05:11:18.403422] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:20:58.345 [2024-12-15 05:11:18.403484] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.345 [2024-12-15 05:11:18.403498] 
mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:20:58.345 [2024-12-15 05:11:18.403513] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.483 ms 00:20:58.345 [2024-12-15 05:11:18.403521] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.345 [2024-12-15 05:11:18.405277] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:20:58.345 [2024-12-15 05:11:18.409017] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.345 [2024-12-15 05:11:18.409066] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:20:58.345 [2024-12-15 05:11:18.409084] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.742 ms 00:20:58.345 [2024-12-15 05:11:18.409095] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.345 [2024-12-15 05:11:18.409167] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.345 [2024-12-15 05:11:18.409186] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:20:58.345 [2024-12-15 05:11:18.409199] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.026 ms 00:20:58.345 [2024-12-15 05:11:18.409207] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.345 [2024-12-15 05:11:18.417003] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.345 [2024-12-15 05:11:18.417044] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:20:58.345 [2024-12-15 05:11:18.417063] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.754 ms 00:20:58.345 [2024-12-15 05:11:18.417071] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.345 [2024-12-15 05:11:18.417170] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.345 [2024-12-15 05:11:18.417186] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:20:58.345 [2024-12-15 05:11:18.417198] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.074 ms 00:20:58.345 [2024-12-15 05:11:18.417214] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.345 [2024-12-15 05:11:18.417273] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.345 [2024-12-15 05:11:18.417286] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:20:58.345 [2024-12-15 05:11:18.417299] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.008 ms 00:20:58.345 [2024-12-15 05:11:18.417310] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.345 [2024-12-15 05:11:18.417333] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:20:58.345 [2024-12-15 05:11:18.419293] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.346 [2024-12-15 05:11:18.419324] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:20:58.346 [2024-12-15 05:11:18.419335] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.965 ms 00:20:58.346 [2024-12-15 05:11:18.419352] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.346 [2024-12-15 05:11:18.419390] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.346 [2024-12-15 05:11:18.419400] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:20:58.346 [2024-12-15 05:11:18.419409] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:20:58.346 [2024-12-15 05:11:18.419420] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.346 [2024-12-15 05:11:18.419483] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:20:58.346 [2024-12-15 05:11:18.419510] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:20:58.346 [2024-12-15 05:11:18.419547] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:20:58.346 [2024-12-15 05:11:18.419567] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:20:58.346 [2024-12-15 05:11:18.419675] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:20:58.346 [2024-12-15 05:11:18.419688] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:20:58.346 [2024-12-15 05:11:18.419705] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:20:58.346 [2024-12-15 05:11:18.419715] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:20:58.346 [2024-12-15 05:11:18.419724] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:20:58.346 [2024-12-15 05:11:18.419733] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:20:58.346 [2024-12-15 05:11:18.419742] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:20:58.346 [2024-12-15 05:11:18.419750] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:20:58.346 [2024-12-15 05:11:18.419758] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:20:58.346 [2024-12-15 05:11:18.419769] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.346 [2024-12-15 05:11:18.419778] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:20:58.346 [2024-12-15 05:11:18.419788] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.292 ms 00:20:58.346 [2024-12-15 05:11:18.419798] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.346 [2024-12-15 05:11:18.419886] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.346 [2024-12-15 05:11:18.419896] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:20:58.346 [2024-12-15 05:11:18.419904] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:20:58.346 [2024-12-15 05:11:18.419916] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.346 [2024-12-15 05:11:18.420012] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:20:58.346 [2024-12-15 05:11:18.420026] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:20:58.346 [2024-12-15 05:11:18.420043] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:20:58.346 [2024-12-15 05:11:18.420062] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420072] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:20:58.346 [2024-12-15 05:11:18.420082] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 
MiB 00:20:58.346 [2024-12-15 05:11:18.420091] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:20:58.346 [2024-12-15 05:11:18.420099] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:20:58.346 [2024-12-15 05:11:18.420108] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420116] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:20:58.346 [2024-12-15 05:11:18.420126] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:20:58.346 [2024-12-15 05:11:18.420153] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:20:58.346 [2024-12-15 05:11:18.420163] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:20:58.346 [2024-12-15 05:11:18.420171] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:20:58.346 [2024-12-15 05:11:18.420179] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:20:58.346 [2024-12-15 05:11:18.420187] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420196] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:20:58.346 [2024-12-15 05:11:18.420206] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:20:58.346 [2024-12-15 05:11:18.420214] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420222] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:20:58.346 [2024-12-15 05:11:18.420231] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420239] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:20:58.346 [2024-12-15 05:11:18.420247] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:20:58.346 [2024-12-15 05:11:18.420255] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420265] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:20:58.346 [2024-12-15 05:11:18.420273] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:20:58.346 [2024-12-15 05:11:18.420281] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420294] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:20:58.346 [2024-12-15 05:11:18.420302] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:20:58.346 [2024-12-15 05:11:18.420311] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420319] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:20:58.346 [2024-12-15 05:11:18.420325] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:20:58.346 [2024-12-15 05:11:18.420332] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420339] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:20:58.346 [2024-12-15 05:11:18.420345] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:20:58.346 [2024-12-15 05:11:18.420353] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:20:58.346 [2024-12-15 05:11:18.420360] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:20:58.346 [2024-12-15 05:11:18.420368] ftl_layout.c: 130:dump_region: 
*NOTICE*: [FTL][ftl0] Region trim_log 00:20:58.346 [2024-12-15 05:11:18.420375] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:20:58.346 [2024-12-15 05:11:18.420381] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420388] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:20:58.346 [2024-12-15 05:11:18.420395] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:20:58.346 [2024-12-15 05:11:18.420403] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420412] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:20:58.346 [2024-12-15 05:11:18.420422] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:20:58.346 [2024-12-15 05:11:18.420430] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:20:58.346 [2024-12-15 05:11:18.420454] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:20:58.346 [2024-12-15 05:11:18.420462] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:20:58.346 [2024-12-15 05:11:18.420473] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:20:58.346 [2024-12-15 05:11:18.420481] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:20:58.346 [2024-12-15 05:11:18.420488] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:20:58.346 [2024-12-15 05:11:18.420495] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:20:58.346 [2024-12-15 05:11:18.420501] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:20:58.346 [2024-12-15 05:11:18.420510] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:20:58.346 [2024-12-15 05:11:18.420519] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:20:58.346 [2024-12-15 05:11:18.420529] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:20:58.346 [2024-12-15 05:11:18.420537] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:20:58.346 [2024-12-15 05:11:18.420544] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:20:58.346 [2024-12-15 05:11:18.420552] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:20:58.346 [2024-12-15 05:11:18.420562] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:20:58.346 [2024-12-15 05:11:18.420571] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:20:58.346 [2024-12-15 05:11:18.420588] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:20:58.346 [2024-12-15 05:11:18.420595] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:20:58.346 [2024-12-15 05:11:18.420602] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: 
*NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:20:58.346 [2024-12-15 05:11:18.420610] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:20:58.346 [2024-12-15 05:11:18.420617] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:20:58.346 [2024-12-15 05:11:18.420624] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:20:58.346 [2024-12-15 05:11:18.420631] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:20:58.346 [2024-12-15 05:11:18.420639] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:20:58.346 [2024-12-15 05:11:18.420646] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:20:58.346 [2024-12-15 05:11:18.420655] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:20:58.346 [2024-12-15 05:11:18.420667] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:20:58.346 [2024-12-15 05:11:18.420675] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:20:58.347 [2024-12-15 05:11:18.420683] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:20:58.347 [2024-12-15 05:11:18.420691] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:20:58.347 [2024-12-15 05:11:18.420701] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.347 [2024-12-15 05:11:18.420708] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:20:58.347 [2024-12-15 05:11:18.420715] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.755 ms 00:20:58.347 [2024-12-15 05:11:18.420725] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.347 [2024-12-15 05:11:18.434303] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.347 [2024-12-15 05:11:18.434361] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:20:58.347 [2024-12-15 05:11:18.434373] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 13.533 ms 00:20:58.347 [2024-12-15 05:11:18.434383] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.347 [2024-12-15 05:11:18.434493] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.347 [2024-12-15 05:11:18.434504] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:20:58.347 [2024-12-15 05:11:18.434513] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.084 ms 00:20:58.347 [2024-12-15 05:11:18.434522] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.347 [2024-12-15 05:11:18.455022] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.347 [2024-12-15 05:11:18.455099] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: 
Initialize NV cache 00:20:58.347 [2024-12-15 05:11:18.455118] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 20.439 ms 00:20:58.347 [2024-12-15 05:11:18.455130] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.347 [2024-12-15 05:11:18.455192] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.347 [2024-12-15 05:11:18.455207] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:20:58.347 [2024-12-15 05:11:18.455220] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:20:58.347 [2024-12-15 05:11:18.455232] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.347 [2024-12-15 05:11:18.455849] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.347 [2024-12-15 05:11:18.455891] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:20:58.347 [2024-12-15 05:11:18.455908] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.528 ms 00:20:58.347 [2024-12-15 05:11:18.455922] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.347 [2024-12-15 05:11:18.456157] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.347 [2024-12-15 05:11:18.456176] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:20:58.347 [2024-12-15 05:11:18.456189] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.195 ms 00:20:58.347 [2024-12-15 05:11:18.456201] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.347 [2024-12-15 05:11:18.464784] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.347 [2024-12-15 05:11:18.464828] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:20:58.347 [2024-12-15 05:11:18.464839] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.554 ms 00:20:58.347 [2024-12-15 05:11:18.464847] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.347 [2024-12-15 05:11:18.468656] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 2, empty chunks = 2 00:20:58.347 [2024-12-15 05:11:18.468709] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:20:58.347 [2024-12-15 05:11:18.468726] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.347 [2024-12-15 05:11:18.468735] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:20:58.347 [2024-12-15 05:11:18.468745] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.785 ms 00:20:58.347 [2024-12-15 05:11:18.468753] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.609 [2024-12-15 05:11:18.484857] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.609 [2024-12-15 05:11:18.484903] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:20:58.609 [2024-12-15 05:11:18.484916] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 16.043 ms 00:20:58.609 [2024-12-15 05:11:18.484925] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.609 [2024-12-15 05:11:18.487754] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.609 [2024-12-15 05:11:18.487798] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:20:58.609 [2024-12-15 05:11:18.487808] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.778 ms 00:20:58.609 [2024-12-15 05:11:18.487816] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.609 [2024-12-15 05:11:18.490528] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.609 [2024-12-15 05:11:18.490570] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:20:58.609 [2024-12-15 05:11:18.490581] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.668 ms 00:20:58.609 [2024-12-15 05:11:18.490589] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.609 [2024-12-15 05:11:18.490932] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.609 [2024-12-15 05:11:18.490946] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:20:58.609 [2024-12-15 05:11:18.490957] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.267 ms 00:20:58.609 [2024-12-15 05:11:18.490966] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.609 [2024-12-15 05:11:18.516921] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.609 [2024-12-15 05:11:18.516975] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:20:58.610 [2024-12-15 05:11:18.516988] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 25.930 ms 00:20:58.610 [2024-12-15 05:11:18.516998] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.610 [2024-12-15 05:11:18.525129] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:20:58.610 [2024-12-15 05:11:18.528357] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.610 [2024-12-15 05:11:18.528403] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:20:58.610 [2024-12-15 05:11:18.528423] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 11.309 ms 00:20:58.610 [2024-12-15 05:11:18.528445] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.610 [2024-12-15 05:11:18.528522] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.610 [2024-12-15 05:11:18.528534] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:20:58.610 [2024-12-15 05:11:18.528544] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.012 ms 00:20:58.610 [2024-12-15 05:11:18.528554] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.610 [2024-12-15 05:11:18.528623] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.610 [2024-12-15 05:11:18.528640] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:20:58.610 [2024-12-15 05:11:18.528651] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.033 ms 00:20:58.610 [2024-12-15 05:11:18.528658] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.610 [2024-12-15 05:11:18.528683] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.610 [2024-12-15 05:11:18.528693] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:20:58.610 [2024-12-15 05:11:18.528707] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:20:58.610 [2024-12-15 05:11:18.528717] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.610 [2024-12-15 05:11:18.528755] mngt/ftl_mngt_self_test.c: 
208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:20:58.610 [2024-12-15 05:11:18.528769] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.610 [2024-12-15 05:11:18.528778] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:20:58.610 [2024-12-15 05:11:18.528791] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.016 ms 00:20:58.610 [2024-12-15 05:11:18.528800] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.610 [2024-12-15 05:11:18.534493] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.610 [2024-12-15 05:11:18.534539] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:20:58.610 [2024-12-15 05:11:18.534551] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.671 ms 00:20:58.610 [2024-12-15 05:11:18.534560] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.610 [2024-12-15 05:11:18.534641] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:20:58.610 [2024-12-15 05:11:18.534650] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:20:58.610 [2024-12-15 05:11:18.534667] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.037 ms 00:20:58.610 [2024-12-15 05:11:18.534679] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:20:58.610 [2024-12-15 05:11:18.536275] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 132.995 ms, result 0 00:20:59.999  [2024-12-15T05:11:21.082Z] Copying: 11/1024 [MB] (11 MBps) [2024-12-15T05:11:22.026Z] Copying: 25/1024 [MB] (14 MBps) [2024-12-15T05:11:22.971Z] Copying: 37/1024 [MB] (11 MBps) [2024-12-15T05:11:23.914Z] Copying: 50/1024 [MB] (12 MBps) [2024-12-15T05:11:24.859Z] Copying: 74/1024 [MB] (23 MBps) [2024-12-15T05:11:25.803Z] Copying: 93/1024 [MB] (19 MBps) [2024-12-15T05:11:26.747Z] Copying: 114/1024 [MB] (20 MBps) [2024-12-15T05:11:28.135Z] Copying: 137/1024 [MB] (22 MBps) [2024-12-15T05:11:29.078Z] Copying: 155/1024 [MB] (18 MBps) [2024-12-15T05:11:30.022Z] Copying: 175/1024 [MB] (19 MBps) [2024-12-15T05:11:30.966Z] Copying: 194/1024 [MB] (19 MBps) [2024-12-15T05:11:31.910Z] Copying: 217/1024 [MB] (22 MBps) [2024-12-15T05:11:32.855Z] Copying: 233/1024 [MB] (16 MBps) [2024-12-15T05:11:33.797Z] Copying: 249/1024 [MB] (15 MBps) [2024-12-15T05:11:34.741Z] Copying: 264/1024 [MB] (15 MBps) [2024-12-15T05:11:36.127Z] Copying: 282/1024 [MB] (17 MBps) [2024-12-15T05:11:37.071Z] Copying: 299/1024 [MB] (17 MBps) [2024-12-15T05:11:38.015Z] Copying: 322/1024 [MB] (22 MBps) [2024-12-15T05:11:38.959Z] Copying: 333/1024 [MB] (10 MBps) [2024-12-15T05:11:39.902Z] Copying: 343/1024 [MB] (10 MBps) [2024-12-15T05:11:40.878Z] Copying: 354/1024 [MB] (10 MBps) [2024-12-15T05:11:41.822Z] Copying: 364/1024 [MB] (10 MBps) [2024-12-15T05:11:42.766Z] Copying: 375/1024 [MB] (10 MBps) [2024-12-15T05:11:44.152Z] Copying: 385/1024 [MB] (10 MBps) [2024-12-15T05:11:44.722Z] Copying: 396/1024 [MB] (10 MBps) [2024-12-15T05:11:46.107Z] Copying: 419/1024 [MB] (23 MBps) [2024-12-15T05:11:47.051Z] Copying: 431/1024 [MB] (11 MBps) [2024-12-15T05:11:47.994Z] Copying: 447/1024 [MB] (15 MBps) [2024-12-15T05:11:48.937Z] Copying: 459/1024 [MB] (12 MBps) [2024-12-15T05:11:49.880Z] Copying: 472/1024 [MB] (12 MBps) [2024-12-15T05:11:50.824Z] Copying: 491/1024 [MB] (19 MBps) [2024-12-15T05:11:51.768Z] Copying: 509/1024 [MB] (17 MBps) [2024-12-15T05:11:53.155Z] 
Copying: 524/1024 [MB] (14 MBps) [2024-12-15T05:11:53.727Z] Copying: 537/1024 [MB] (13 MBps) [2024-12-15T05:11:55.114Z] Copying: 556/1024 [MB] (19 MBps) [2024-12-15T05:11:56.050Z] Copying: 577/1024 [MB] (20 MBps) [2024-12-15T05:11:56.776Z] Copying: 588/1024 [MB] (11 MBps) [2024-12-15T05:11:57.718Z] Copying: 605/1024 [MB] (16 MBps) [2024-12-15T05:11:59.103Z] Copying: 626/1024 [MB] (20 MBps) [2024-12-15T05:12:00.041Z] Copying: 647/1024 [MB] (21 MBps) [2024-12-15T05:12:00.983Z] Copying: 661/1024 [MB] (14 MBps) [2024-12-15T05:12:01.926Z] Copying: 672/1024 [MB] (10 MBps) [2024-12-15T05:12:02.869Z] Copying: 682/1024 [MB] (10 MBps) [2024-12-15T05:12:03.813Z] Copying: 693/1024 [MB] (10 MBps) [2024-12-15T05:12:04.755Z] Copying: 704/1024 [MB] (10 MBps) [2024-12-15T05:12:06.138Z] Copying: 714/1024 [MB] (10 MBps) [2024-12-15T05:12:07.084Z] Copying: 730/1024 [MB] (16 MBps) [2024-12-15T05:12:07.722Z] Copying: 742/1024 [MB] (11 MBps) [2024-12-15T05:12:09.107Z] Copying: 753/1024 [MB] (11 MBps) [2024-12-15T05:12:10.051Z] Copying: 769/1024 [MB] (15 MBps) [2024-12-15T05:12:10.993Z] Copying: 783/1024 [MB] (13 MBps) [2024-12-15T05:12:11.936Z] Copying: 802/1024 [MB] (19 MBps) [2024-12-15T05:12:12.878Z] Copying: 817/1024 [MB] (14 MBps) [2024-12-15T05:12:13.822Z] Copying: 835/1024 [MB] (18 MBps) [2024-12-15T05:12:14.767Z] Copying: 849/1024 [MB] (13 MBps) [2024-12-15T05:12:16.152Z] Copying: 861/1024 [MB] (11 MBps) [2024-12-15T05:12:16.724Z] Copying: 879/1024 [MB] (17 MBps) [2024-12-15T05:12:18.111Z] Copying: 893/1024 [MB] (14 MBps) [2024-12-15T05:12:19.055Z] Copying: 911/1024 [MB] (17 MBps) [2024-12-15T05:12:19.999Z] Copying: 927/1024 [MB] (16 MBps) [2024-12-15T05:12:20.942Z] Copying: 945/1024 [MB] (18 MBps) [2024-12-15T05:12:21.887Z] Copying: 956/1024 [MB] (10 MBps) [2024-12-15T05:12:22.831Z] Copying: 967/1024 [MB] (10 MBps) [2024-12-15T05:12:23.778Z] Copying: 978/1024 [MB] (10 MBps) [2024-12-15T05:12:24.723Z] Copying: 988/1024 [MB] (10 MBps) [2024-12-15T05:12:26.107Z] Copying: 999/1024 [MB] (10 MBps) [2024-12-15T05:12:26.679Z] Copying: 1014/1024 [MB] (15 MBps) [2024-12-15T05:12:27.253Z] Copying: 1024/1024 [MB] (average 15 MBps)[2024-12-15 05:12:27.057955] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.113 [2024-12-15 05:12:27.058058] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:22:07.113 [2024-12-15 05:12:27.058084] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:22:07.113 [2024-12-15 05:12:27.058097] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.113 [2024-12-15 05:12:27.058127] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:22:07.113 [2024-12-15 05:12:27.059964] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.113 [2024-12-15 05:12:27.060073] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:22:07.113 [2024-12-15 05:12:27.060086] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.816 ms 00:22:07.113 [2024-12-15 05:12:27.060096] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.113 [2024-12-15 05:12:27.060406] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.113 [2024-12-15 05:12:27.060420] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:22:07.113 [2024-12-15 05:12:27.060460] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.279 ms 00:22:07.113 [2024-12-15 
05:12:27.060476] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.113 [2024-12-15 05:12:27.065116] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.113 [2024-12-15 05:12:27.065141] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:22:07.113 [2024-12-15 05:12:27.065152] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.621 ms 00:22:07.113 [2024-12-15 05:12:27.065161] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.113 [2024-12-15 05:12:27.071299] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.113 [2024-12-15 05:12:27.071333] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:22:07.113 [2024-12-15 05:12:27.071344] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.119 ms 00:22:07.113 [2024-12-15 05:12:27.071359] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.113 [2024-12-15 05:12:27.074267] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.113 [2024-12-15 05:12:27.074315] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:22:07.113 [2024-12-15 05:12:27.074326] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.846 ms 00:22:07.113 [2024-12-15 05:12:27.074335] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.113 [2024-12-15 05:12:27.080056] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.113 [2024-12-15 05:12:27.080113] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:22:07.113 [2024-12-15 05:12:27.080125] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.675 ms 00:22:07.113 [2024-12-15 05:12:27.080137] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.113 [2024-12-15 05:12:27.080277] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.113 [2024-12-15 05:12:27.080288] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:22:07.113 [2024-12-15 05:12:27.080306] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.084 ms 00:22:07.113 [2024-12-15 05:12:27.080318] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.113 [2024-12-15 05:12:27.083552] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.113 [2024-12-15 05:12:27.083593] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:22:07.114 [2024-12-15 05:12:27.083604] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.209 ms 00:22:07.114 [2024-12-15 05:12:27.083612] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.114 [2024-12-15 05:12:27.087094] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.114 [2024-12-15 05:12:27.087133] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:22:07.114 [2024-12-15 05:12:27.087143] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.441 ms 00:22:07.114 [2024-12-15 05:12:27.087151] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.114 [2024-12-15 05:12:27.089634] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.114 [2024-12-15 05:12:27.089675] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:22:07.114 [2024-12-15 05:12:27.089685] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: 
[FTL][ftl0] duration: 2.444 ms 00:22:07.114 [2024-12-15 05:12:27.089692] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.114 [2024-12-15 05:12:27.091837] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.114 [2024-12-15 05:12:27.091877] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:22:07.114 [2024-12-15 05:12:27.091887] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.074 ms 00:22:07.114 [2024-12-15 05:12:27.091894] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.114 [2024-12-15 05:12:27.091933] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:22:07.114 [2024-12-15 05:12:27.091950] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.091971] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.091980] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.091989] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.091998] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092007] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092016] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092025] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092032] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092041] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092050] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092058] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092066] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092073] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092081] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092088] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092096] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092104] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092111] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092119] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: 
free 00:22:07.114 [2024-12-15 05:12:27.092127] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092134] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092142] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092149] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092157] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092179] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092187] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092195] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092203] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092210] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092219] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092227] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092235] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092243] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092252] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092259] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092267] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092275] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092284] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092292] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092300] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092307] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092315] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092323] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092331] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 
261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092338] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092346] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092353] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092362] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092370] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092377] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092385] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092393] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092401] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092409] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092416] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092424] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092431] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092456] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092464] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092472] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092480] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092487] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092495] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092505] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092512] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092521] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092528] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092539] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092546] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: 
[FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092564] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092572] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092580] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092588] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092596] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092603] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092611] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:22:07.114 [2024-12-15 05:12:27.092619] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092626] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092634] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092642] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092651] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092659] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092667] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092675] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092683] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092698] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092706] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092714] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092721] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092729] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092737] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092745] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092753] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092760] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092768] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092776] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092784] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092791] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092799] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:22:07.115 [2024-12-15 05:12:27.092815] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:22:07.115 [2024-12-15 05:12:27.092823] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 30f623fd-11b6-47a5-ae87-857377a3ed0b 00:22:07.115 [2024-12-15 05:12:27.092835] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:22:07.115 [2024-12-15 05:12:27.092843] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:22:07.115 [2024-12-15 05:12:27.092851] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:22:07.115 [2024-12-15 05:12:27.092860] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:22:07.115 [2024-12-15 05:12:27.092867] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:22:07.115 [2024-12-15 05:12:27.092875] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:22:07.115 [2024-12-15 05:12:27.092893] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:22:07.115 [2024-12-15 05:12:27.092905] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:22:07.115 [2024-12-15 05:12:27.092912] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:22:07.115 [2024-12-15 05:12:27.092919] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.115 [2024-12-15 05:12:27.092930] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:22:07.115 [2024-12-15 05:12:27.092940] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.987 ms 00:22:07.115 [2024-12-15 05:12:27.092948] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.095320] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.115 [2024-12-15 05:12:27.095356] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:22:07.115 [2024-12-15 05:12:27.095366] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.354 ms 00:22:07.115 [2024-12-15 05:12:27.095375] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.095519] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:07.115 [2024-12-15 05:12:27.095529] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:22:07.115 [2024-12-15 05:12:27.095538] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.113 ms 00:22:07.115 [2024-12-15 05:12:27.095546] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.102874] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 
05:12:27.102916] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:22:07.115 [2024-12-15 05:12:27.102927] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.102945] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.103006] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.103015] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:22:07.115 [2024-12-15 05:12:27.103023] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.103032] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.103095] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.103106] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:22:07.115 [2024-12-15 05:12:27.103118] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.103135] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.103154] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.103164] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:22:07.115 [2024-12-15 05:12:27.103172] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.103181] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.116914] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.116964] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:22:07.115 [2024-12-15 05:12:27.116975] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.116992] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.128135] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.128205] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:22:07.115 [2024-12-15 05:12:27.128218] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.128227] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.128285] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.128296] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:22:07.115 [2024-12-15 05:12:27.128305] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.128314] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.128362] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.128375] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:22:07.115 [2024-12-15 05:12:27.128384] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.128392] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.128490] mngt/ftl_mngt.c: 
427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.128502] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:22:07.115 [2024-12-15 05:12:27.128510] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.128523] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.128554] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.128564] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:22:07.115 [2024-12-15 05:12:27.128579] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.128587] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.128632] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.128642] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:22:07.115 [2024-12-15 05:12:27.128650] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.128665] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.128714] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:22:07.115 [2024-12-15 05:12:27.128728] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:22:07.115 [2024-12-15 05:12:27.128736] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:22:07.115 [2024-12-15 05:12:27.128746] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:07.115 [2024-12-15 05:12:27.128880] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 70.907 ms, result 0 00:22:07.376 00:22:07.376 00:22:07.376 05:12:27 ftl.ftl_restore -- ftl/restore.sh@76 -- # md5sum -c /home/vagrant/spdk_repo/spdk/test/ftl/testfile.md5 00:22:09.924 /home/vagrant/spdk_repo/spdk/test/ftl/testfile: OK 00:22:09.924 05:12:29 ftl.ftl_restore -- ftl/restore.sh@79 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --ob=ftl0 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json --seek=131072 00:22:09.924 [2024-12-15 05:12:29.528426] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:22:09.924 [2024-12-15 05:12:29.528570] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid91743 ] 00:22:09.924 [2024-12-15 05:12:29.690275] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:09.924 [2024-12-15 05:12:29.719131] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:22:09.924 [2024-12-15 05:12:29.835695] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:22:09.924 [2024-12-15 05:12:29.835782] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:22:09.924 [2024-12-15 05:12:29.996841] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:29.996893] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:22:09.924 [2024-12-15 05:12:29.996912] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:22:09.924 [2024-12-15 05:12:29.996921] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:29.996979] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:29.996990] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:22:09.924 [2024-12-15 05:12:29.996999] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.039 ms 00:22:09.924 [2024-12-15 05:12:29.997007] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:29.997032] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:22:09.924 [2024-12-15 05:12:29.997325] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:22:09.924 [2024-12-15 05:12:29.997343] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:29.997356] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:22:09.924 [2024-12-15 05:12:29.997368] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.317 ms 00:22:09.924 [2024-12-15 05:12:29.997375] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:29.999094] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:22:09.924 [2024-12-15 05:12:30.003074] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:30.003121] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:22:09.924 [2024-12-15 05:12:30.003148] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.981 ms 00:22:09.924 [2024-12-15 05:12:30.003159] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:30.003232] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:30.003245] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:22:09.924 [2024-12-15 05:12:30.003255] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.027 ms 00:22:09.924 [2024-12-15 05:12:30.003263] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:30.011333] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
00:22:09.924 [2024-12-15 05:12:30.011379] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:22:09.924 [2024-12-15 05:12:30.011390] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.026 ms 00:22:09.924 [2024-12-15 05:12:30.011399] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:30.011517] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:30.011531] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:22:09.924 [2024-12-15 05:12:30.011541] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.092 ms 00:22:09.924 [2024-12-15 05:12:30.011548] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:30.011607] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:30.011618] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:22:09.924 [2024-12-15 05:12:30.011631] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.007 ms 00:22:09.924 [2024-12-15 05:12:30.011638] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:30.011659] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:22:09.924 [2024-12-15 05:12:30.013864] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:30.013894] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:22:09.924 [2024-12-15 05:12:30.013904] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.210 ms 00:22:09.924 [2024-12-15 05:12:30.013912] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:30.013952] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:30.013961] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:22:09.924 [2024-12-15 05:12:30.013973] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:22:09.924 [2024-12-15 05:12:30.013981] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:30.014005] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:22:09.924 [2024-12-15 05:12:30.014028] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:22:09.924 [2024-12-15 05:12:30.014065] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:22:09.924 [2024-12-15 05:12:30.014081] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:22:09.924 [2024-12-15 05:12:30.014188] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:22:09.924 [2024-12-15 05:12:30.014203] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:22:09.924 [2024-12-15 05:12:30.014214] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:22:09.924 [2024-12-15 05:12:30.014224] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:22:09.924 [2024-12-15 05:12:30.014233] ftl_layout.c: 
687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:22:09.924 [2024-12-15 05:12:30.014247] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:22:09.924 [2024-12-15 05:12:30.014256] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:22:09.924 [2024-12-15 05:12:30.014264] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:22:09.924 [2024-12-15 05:12:30.014272] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:22:09.924 [2024-12-15 05:12:30.014280] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:30.014287] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:22:09.924 [2024-12-15 05:12:30.014299] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.280 ms 00:22:09.924 [2024-12-15 05:12:30.014312] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:30.014395] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.924 [2024-12-15 05:12:30.014404] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:22:09.924 [2024-12-15 05:12:30.014411] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:22:09.924 [2024-12-15 05:12:30.014418] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.924 [2024-12-15 05:12:30.014537] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:22:09.925 [2024-12-15 05:12:30.014549] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:22:09.925 [2024-12-15 05:12:30.014558] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:22:09.925 [2024-12-15 05:12:30.014576] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014587] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:22:09.925 [2024-12-15 05:12:30.014595] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014603] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:22:09.925 [2024-12-15 05:12:30.014613] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:22:09.925 [2024-12-15 05:12:30.014621] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014629] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:22:09.925 [2024-12-15 05:12:30.014636] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:22:09.925 [2024-12-15 05:12:30.014646] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:22:09.925 [2024-12-15 05:12:30.014654] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:22:09.925 [2024-12-15 05:12:30.014663] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:22:09.925 [2024-12-15 05:12:30.014674] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:22:09.925 [2024-12-15 05:12:30.014682] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014690] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:22:09.925 [2024-12-15 05:12:30.014699] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:22:09.925 [2024-12-15 05:12:30.014706] ftl_layout.c: 
133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014714] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:22:09.925 [2024-12-15 05:12:30.014722] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014730] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:22:09.925 [2024-12-15 05:12:30.014738] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:22:09.925 [2024-12-15 05:12:30.014746] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014753] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:22:09.925 [2024-12-15 05:12:30.014760] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:22:09.925 [2024-12-15 05:12:30.014768] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014783] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:22:09.925 [2024-12-15 05:12:30.014790] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:22:09.925 [2024-12-15 05:12:30.014798] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014805] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:22:09.925 [2024-12-15 05:12:30.014813] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:22:09.925 [2024-12-15 05:12:30.014820] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014828] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:22:09.925 [2024-12-15 05:12:30.014835] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:22:09.925 [2024-12-15 05:12:30.014843] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:22:09.925 [2024-12-15 05:12:30.014850] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:22:09.925 [2024-12-15 05:12:30.014858] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:22:09.925 [2024-12-15 05:12:30.014866] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:22:09.925 [2024-12-15 05:12:30.014874] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014882] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:22:09.925 [2024-12-15 05:12:30.014890] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:22:09.925 [2024-12-15 05:12:30.014897] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014906] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:22:09.925 [2024-12-15 05:12:30.014913] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:22:09.925 [2024-12-15 05:12:30.014921] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:22:09.925 [2024-12-15 05:12:30.014930] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:22:09.925 [2024-12-15 05:12:30.014938] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:22:09.925 [2024-12-15 05:12:30.014945] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:22:09.925 [2024-12-15 05:12:30.014951] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:22:09.925 
[2024-12-15 05:12:30.014958] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:22:09.925 [2024-12-15 05:12:30.014965] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:22:09.925 [2024-12-15 05:12:30.014971] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:22:09.925 [2024-12-15 05:12:30.014980] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:22:09.925 [2024-12-15 05:12:30.014989] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:22:09.925 [2024-12-15 05:12:30.014998] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:22:09.925 [2024-12-15 05:12:30.015006] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:22:09.925 [2024-12-15 05:12:30.015013] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:22:09.925 [2024-12-15 05:12:30.015020] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:22:09.925 [2024-12-15 05:12:30.015029] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:22:09.925 [2024-12-15 05:12:30.015036] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:22:09.925 [2024-12-15 05:12:30.015043] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:22:09.925 [2024-12-15 05:12:30.015050] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:22:09.925 [2024-12-15 05:12:30.015057] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:22:09.925 [2024-12-15 05:12:30.015065] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:22:09.925 [2024-12-15 05:12:30.015071] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:22:09.925 [2024-12-15 05:12:30.015078] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:22:09.925 [2024-12-15 05:12:30.015085] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:22:09.925 [2024-12-15 05:12:30.015094] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:22:09.925 [2024-12-15 05:12:30.015102] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:22:09.925 [2024-12-15 05:12:30.015110] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:22:09.925 [2024-12-15 05:12:30.015122] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe 
ver:0 blk_offs:0x20 blk_sz:0x20 00:22:09.925 [2024-12-15 05:12:30.015129] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:22:09.925 [2024-12-15 05:12:30.015136] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:22:09.925 [2024-12-15 05:12:30.015144] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:22:09.925 [2024-12-15 05:12:30.015153] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.925 [2024-12-15 05:12:30.015161] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:22:09.925 [2024-12-15 05:12:30.015174] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.683 ms 00:22:09.925 [2024-12-15 05:12:30.015182] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.925 [2024-12-15 05:12:30.030621] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.925 [2024-12-15 05:12:30.030671] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:22:09.925 [2024-12-15 05:12:30.030684] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 15.396 ms 00:22:09.925 [2024-12-15 05:12:30.030692] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.925 [2024-12-15 05:12:30.030777] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.925 [2024-12-15 05:12:30.030787] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:22:09.925 [2024-12-15 05:12:30.030797] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.060 ms 00:22:09.925 [2024-12-15 05:12:30.030812] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.925 [2024-12-15 05:12:30.058628] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.925 [2024-12-15 05:12:30.058689] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:22:09.925 [2024-12-15 05:12:30.058707] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 27.753 ms 00:22:09.925 [2024-12-15 05:12:30.058718] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.925 [2024-12-15 05:12:30.058780] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.925 [2024-12-15 05:12:30.058794] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:22:09.925 [2024-12-15 05:12:30.058807] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:22:09.925 [2024-12-15 05:12:30.058825] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.925 [2024-12-15 05:12:30.059489] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.926 [2024-12-15 05:12:30.059531] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:22:09.926 [2024-12-15 05:12:30.059547] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.584 ms 00:22:09.926 [2024-12-15 05:12:30.059559] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:09.926 [2024-12-15 05:12:30.059770] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:09.926 [2024-12-15 05:12:30.059784] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:22:09.926 [2024-12-15 05:12:30.059796] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.174 ms 00:22:09.926 [2024-12-15 05:12:30.059806] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.068719] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.068764] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:22:10.187 [2024-12-15 05:12:30.068775] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.881 ms 00:22:10.187 [2024-12-15 05:12:30.068782] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.072648] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 2, empty chunks = 2 00:22:10.187 [2024-12-15 05:12:30.072702] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:22:10.187 [2024-12-15 05:12:30.072715] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.072724] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:22:10.187 [2024-12-15 05:12:30.072733] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.822 ms 00:22:10.187 [2024-12-15 05:12:30.072741] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.088337] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.088383] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:22:10.187 [2024-12-15 05:12:30.088400] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 15.537 ms 00:22:10.187 [2024-12-15 05:12:30.088408] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.091100] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.091140] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:22:10.187 [2024-12-15 05:12:30.091150] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.626 ms 00:22:10.187 [2024-12-15 05:12:30.091157] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.093742] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.093783] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:22:10.187 [2024-12-15 05:12:30.093792] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.539 ms 00:22:10.187 [2024-12-15 05:12:30.093799] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.094149] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.094161] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:22:10.187 [2024-12-15 05:12:30.094174] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.273 ms 00:22:10.187 [2024-12-15 05:12:30.094185] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.118700] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.118757] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:22:10.187 [2024-12-15 05:12:30.118770] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 
24.498 ms 00:22:10.187 [2024-12-15 05:12:30.118780] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.127095] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:22:10.187 [2024-12-15 05:12:30.130528] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.130566] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:22:10.187 [2024-12-15 05:12:30.130579] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 11.696 ms 00:22:10.187 [2024-12-15 05:12:30.130594] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.130678] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.130690] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:22:10.187 [2024-12-15 05:12:30.130700] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:22:10.187 [2024-12-15 05:12:30.130709] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.130784] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.130795] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:22:10.187 [2024-12-15 05:12:30.130804] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.042 ms 00:22:10.187 [2024-12-15 05:12:30.130812] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.130836] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.130845] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:22:10.187 [2024-12-15 05:12:30.130854] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.009 ms 00:22:10.187 [2024-12-15 05:12:30.130861] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.130903] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:22:10.187 [2024-12-15 05:12:30.130914] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.130925] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:22:10.187 [2024-12-15 05:12:30.130933] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:22:10.187 [2024-12-15 05:12:30.130941] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.136429] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.187 [2024-12-15 05:12:30.136502] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:22:10.187 [2024-12-15 05:12:30.136512] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.469 ms 00:22:10.187 [2024-12-15 05:12:30.136520] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.187 [2024-12-15 05:12:30.136603] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:22:10.188 [2024-12-15 05:12:30.136614] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:22:10.188 [2024-12-15 05:12:30.136627] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.037 ms 00:22:10.188 [2024-12-15 05:12:30.136639] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:22:10.188 
[2024-12-15 05:12:30.137912] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 140.602 ms, result 0 00:22:11.134  [2024-12-15T05:12:32.219Z] Copying: 10/1024 [MB] (10 MBps) [… intermediate copy-progress updates elided …] [2024-12-15T05:13:29.253Z] Copying: 1024/1024 [MB] (average 17 MBps)[2024-12-15 05:13:29.101349] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.113 [2024-12-15 05:13:29.101656] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:23:09.113 [2024-12-15 05:13:29.101684] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:23:09.113 [2024-12-15 05:13:29.101694] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.113 [2024-12-15 05:13:29.106182] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:23:09.113 [2024-12-15 05:13:29.107844] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.113 [2024-12-15 05:13:29.107915] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:23:09.113 [2024-12-15 05:13:29.107928] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.589 ms 00:23:09.113 [2024-12-15 05:13:29.107937] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.113 [2024-12-15 05:13:29.120297] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.113 [2024-12-15 05:13:29.120504] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:23:09.113 [2024-12-15 05:13:29.120526] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 10.308 ms 00:23:09.113 [2024-12-15 05:13:29.120535] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.113 [2024-12-15 05:13:29.146700] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.113 [2024-12-15 05:13:29.146764] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:23:09.113 [2024-12-15 05:13:29.146777] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 26.140 ms 00:23:09.113 [2024-12-15 05:13:29.146793] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.113 [2024-12-15 05:13:29.152919] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.113 [2024-12-15 05:13:29.152960] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:23:09.113 [2024-12-15 05:13:29.152971] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.089 ms 00:23:09.113 [2024-12-15 05:13:29.152979] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.113 [2024-12-15 05:13:29.155643] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.113 [2024-12-15 05:13:29.155813] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:23:09.113 [2024-12-15 05:13:29.155831] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.612 ms 00:23:09.113 [2024-12-15 05:13:29.155838] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.113 [2024-12-15 05:13:29.160766] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.113 [2024-12-15 05:13:29.160814] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:23:09.113 [2024-12-15 05:13:29.160834] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.855 ms 00:23:09.113 [2024-12-15 05:13:29.160842] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.377 [2024-12-15 05:13:29.409953] mngt/ftl_mngt.c: 
427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.377 [2024-12-15 05:13:29.410012] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:23:09.377 [2024-12-15 05:13:29.410039] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 249.065 ms 00:23:09.377 [2024-12-15 05:13:29.410047] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.377 [2024-12-15 05:13:29.412719] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.377 [2024-12-15 05:13:29.412767] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:23:09.377 [2024-12-15 05:13:29.412778] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.654 ms 00:23:09.377 [2024-12-15 05:13:29.412787] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.377 [2024-12-15 05:13:29.414734] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.377 [2024-12-15 05:13:29.414776] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:23:09.377 [2024-12-15 05:13:29.414785] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.906 ms 00:23:09.377 [2024-12-15 05:13:29.414792] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.377 [2024-12-15 05:13:29.416792] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.377 [2024-12-15 05:13:29.416841] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:23:09.377 [2024-12-15 05:13:29.416851] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.961 ms 00:23:09.377 [2024-12-15 05:13:29.416858] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.377 [2024-12-15 05:13:29.419078] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.377 [2024-12-15 05:13:29.419122] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:23:09.377 [2024-12-15 05:13:29.419132] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.136 ms 00:23:09.377 [2024-12-15 05:13:29.419140] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.377 [2024-12-15 05:13:29.419176] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:23:09.377 [2024-12-15 05:13:29.419191] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 104704 / 261120 wr_cnt: 1 state: open 00:23:09.377 [2024-12-15 05:13:29.419202] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419211] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419219] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419227] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419235] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419242] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419250] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419258] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: 
[FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419266] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419274] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419281] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419290] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419298] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419306] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419313] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419320] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419328] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419336] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419343] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419352] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419359] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419366] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419373] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419381] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419389] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419399] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419407] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419414] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419422] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419430] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419452] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419460] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419475] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419483] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419492] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419500] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419508] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419516] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419523] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419531] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419539] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419547] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419555] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419562] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419570] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419578] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419585] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419593] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419601] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419609] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419616] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419624] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419631] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419639] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419859] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419867] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419875] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 
05:13:29.419885] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419893] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419900] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419910] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419918] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419925] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419934] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419942] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419950] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419958] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419966] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419974] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:23:09.377 [2024-12-15 05:13:29.419981] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.419988] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.419996] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420003] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420011] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420018] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420026] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420033] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420040] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420047] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420054] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420062] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420069] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 
00:23:09.378 [2024-12-15 05:13:29.420077] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420084] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420092] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420100] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420107] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420114] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420121] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420129] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420137] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420144] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420152] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420160] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420168] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420175] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420183] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420190] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420210] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:23:09.378 [2024-12-15 05:13:29.420227] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:23:09.378 [2024-12-15 05:13:29.420235] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 30f623fd-11b6-47a5-ae87-857377a3ed0b 00:23:09.378 [2024-12-15 05:13:29.420258] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 104704 00:23:09.378 [2024-12-15 05:13:29.420266] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 105664 00:23:09.378 [2024-12-15 05:13:29.420274] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 104704 00:23:09.378 [2024-12-15 05:13:29.420282] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: 1.0092 00:23:09.378 [2024-12-15 05:13:29.420290] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:23:09.378 [2024-12-15 05:13:29.420298] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:23:09.378 [2024-12-15 05:13:29.420306] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:23:09.378 [2024-12-15 05:13:29.420319] ftl_debug.c: 
220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:23:09.378 [2024-12-15 05:13:29.420325] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:23:09.378 [2024-12-15 05:13:29.420333] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.378 [2024-12-15 05:13:29.420341] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:23:09.378 [2024-12-15 05:13:29.420353] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.158 ms 00:23:09.378 [2024-12-15 05:13:29.420361] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.422612] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.378 [2024-12-15 05:13:29.422645] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:23:09.378 [2024-12-15 05:13:29.422655] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.233 ms 00:23:09.378 [2024-12-15 05:13:29.422663] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.422791] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:09.378 [2024-12-15 05:13:29.422801] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:23:09.378 [2024-12-15 05:13:29.422813] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.099 ms 00:23:09.378 [2024-12-15 05:13:29.422825] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.430055] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.430237] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:23:09.378 [2024-12-15 05:13:29.430255] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.430264] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.430321] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.430331] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:23:09.378 [2024-12-15 05:13:29.430346] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.430354] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.430420] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.430455] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:23:09.378 [2024-12-15 05:13:29.430464] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.430480] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.430499] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.430507] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:23:09.378 [2024-12-15 05:13:29.430516] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.430524] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.443978] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.444030] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV 
cache 00:23:09.378 [2024-12-15 05:13:29.444053] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.444061] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.455039] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.455098] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:23:09.378 [2024-12-15 05:13:29.455114] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.455126] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.455183] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.455193] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:23:09.378 [2024-12-15 05:13:29.455201] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.455214] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.455253] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.455264] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:23:09.378 [2024-12-15 05:13:29.455272] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.455281] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.455358] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.455369] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:23:09.378 [2024-12-15 05:13:29.455378] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.455386] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.455425] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.455463] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:23:09.378 [2024-12-15 05:13:29.455472] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.455480] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.455527] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.455537] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:23:09.378 [2024-12-15 05:13:29.455546] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.455554] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.455614] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:23:09.378 [2024-12-15 05:13:29.455625] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:23:09.378 [2024-12-15 05:13:29.455633] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:23:09.378 [2024-12-15 05:13:29.455642] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:09.378 [2024-12-15 05:13:29.455787] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 
355.966 ms, result 0 00:23:10.389 00:23:10.389 00:23:10.389 05:13:30 ftl.ftl_restore -- ftl/restore.sh@80 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=ftl0 --of=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json --skip=131072 --count=262144 00:23:10.389 [2024-12-15 05:13:30.250997] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:23:10.389 [2024-12-15 05:13:30.251154] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid92366 ] 00:23:10.389 [2024-12-15 05:13:30.406791] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:10.389 [2024-12-15 05:13:30.434910] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:23:10.652 [2024-12-15 05:13:30.550337] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:23:10.652 [2024-12-15 05:13:30.550457] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:23:10.652 [2024-12-15 05:13:30.711146] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.711212] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:23:10.652 [2024-12-15 05:13:30.711227] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:23:10.652 [2024-12-15 05:13:30.711236] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.711296] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.711312] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:23:10.652 [2024-12-15 05:13:30.711321] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.040 ms 00:23:10.652 [2024-12-15 05:13:30.711329] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.711352] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:23:10.652 [2024-12-15 05:13:30.711662] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:23:10.652 [2024-12-15 05:13:30.711681] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.711689] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:23:10.652 [2024-12-15 05:13:30.711702] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.334 ms 00:23:10.652 [2024-12-15 05:13:30.711713] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.713371] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:23:10.652 [2024-12-15 05:13:30.717077] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.717132] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:23:10.652 [2024-12-15 05:13:30.717147] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.713 ms 00:23:10.652 [2024-12-15 05:13:30.717158] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.717231] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 
05:13:30.717245] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:23:10.652 [2024-12-15 05:13:30.717253] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.025 ms 00:23:10.652 [2024-12-15 05:13:30.717261] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.725093] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.725142] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:23:10.652 [2024-12-15 05:13:30.725153] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.790 ms 00:23:10.652 [2024-12-15 05:13:30.725166] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.725259] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.725268] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:23:10.652 [2024-12-15 05:13:30.725277] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.073 ms 00:23:10.652 [2024-12-15 05:13:30.725285] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.725338] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.725348] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:23:10.652 [2024-12-15 05:13:30.725360] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.007 ms 00:23:10.652 [2024-12-15 05:13:30.725371] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.725398] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:23:10.652 [2024-12-15 05:13:30.727427] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.727472] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:23:10.652 [2024-12-15 05:13:30.727481] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.035 ms 00:23:10.652 [2024-12-15 05:13:30.727490] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.727527] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.727536] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:23:10.652 [2024-12-15 05:13:30.727548] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:23:10.652 [2024-12-15 05:13:30.727555] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.727577] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:23:10.652 [2024-12-15 05:13:30.727602] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:23:10.652 [2024-12-15 05:13:30.727642] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:23:10.652 [2024-12-15 05:13:30.727659] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:23:10.652 [2024-12-15 05:13:30.727769] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:23:10.652 [2024-12-15 05:13:30.727783] upgrade/ftl_sb_v5.c: 
101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:23:10.652 [2024-12-15 05:13:30.727794] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:23:10.652 [2024-12-15 05:13:30.727807] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:23:10.652 [2024-12-15 05:13:30.727816] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:23:10.652 [2024-12-15 05:13:30.727829] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:23:10.652 [2024-12-15 05:13:30.727837] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:23:10.652 [2024-12-15 05:13:30.727845] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:23:10.652 [2024-12-15 05:13:30.727852] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:23:10.652 [2024-12-15 05:13:30.727860] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.727868] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:23:10.652 [2024-12-15 05:13:30.727880] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.285 ms 00:23:10.652 [2024-12-15 05:13:30.727890] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.727972] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.652 [2024-12-15 05:13:30.727980] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:23:10.652 [2024-12-15 05:13:30.727991] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:23:10.652 [2024-12-15 05:13:30.727998] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.652 [2024-12-15 05:13:30.728099] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:23:10.652 [2024-12-15 05:13:30.728111] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:23:10.652 [2024-12-15 05:13:30.728127] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:23:10.652 [2024-12-15 05:13:30.728142] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:23:10.652 [2024-12-15 05:13:30.728153] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:23:10.652 [2024-12-15 05:13:30.728161] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:23:10.652 [2024-12-15 05:13:30.728169] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:23:10.652 [2024-12-15 05:13:30.728177] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:23:10.653 [2024-12-15 05:13:30.728185] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728193] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:23:10.653 [2024-12-15 05:13:30.728228] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:23:10.653 [2024-12-15 05:13:30.728237] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:23:10.653 [2024-12-15 05:13:30.728244] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:23:10.653 [2024-12-15 05:13:30.728254] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:23:10.653 [2024-12-15 05:13:30.728262] ftl_layout.c: 131:dump_region: *NOTICE*: 
[FTL][ftl0] offset: 113.88 MiB 00:23:10.653 [2024-12-15 05:13:30.728270] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728279] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:23:10.653 [2024-12-15 05:13:30.728287] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:23:10.653 [2024-12-15 05:13:30.728297] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728306] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:23:10.653 [2024-12-15 05:13:30.728313] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728321] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:23:10.653 [2024-12-15 05:13:30.728329] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:23:10.653 [2024-12-15 05:13:30.728337] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728344] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:23:10.653 [2024-12-15 05:13:30.728352] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:23:10.653 [2024-12-15 05:13:30.728360] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728367] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:23:10.653 [2024-12-15 05:13:30.728375] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:23:10.653 [2024-12-15 05:13:30.728383] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728390] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:23:10.653 [2024-12-15 05:13:30.728399] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:23:10.653 [2024-12-15 05:13:30.728407] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728415] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:23:10.653 [2024-12-15 05:13:30.728425] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:23:10.653 [2024-12-15 05:13:30.728448] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:23:10.653 [2024-12-15 05:13:30.728455] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:23:10.653 [2024-12-15 05:13:30.728462] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:23:10.653 [2024-12-15 05:13:30.728469] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:23:10.653 [2024-12-15 05:13:30.728476] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728484] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:23:10.653 [2024-12-15 05:13:30.728490] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:23:10.653 [2024-12-15 05:13:30.728497] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728503] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:23:10.653 [2024-12-15 05:13:30.728511] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:23:10.653 [2024-12-15 05:13:30.728521] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:23:10.653 [2024-12-15 
05:13:30.728529] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:23:10.653 [2024-12-15 05:13:30.728536] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:23:10.653 [2024-12-15 05:13:30.728545] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:23:10.653 [2024-12-15 05:13:30.728552] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:23:10.653 [2024-12-15 05:13:30.728562] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:23:10.653 [2024-12-15 05:13:30.728568] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:23:10.653 [2024-12-15 05:13:30.728576] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:23:10.653 [2024-12-15 05:13:30.728585] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:23:10.653 [2024-12-15 05:13:30.728594] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:23:10.653 [2024-12-15 05:13:30.728603] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:23:10.653 [2024-12-15 05:13:30.728610] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:23:10.653 [2024-12-15 05:13:30.728617] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:23:10.653 [2024-12-15 05:13:30.728626] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:23:10.653 [2024-12-15 05:13:30.728633] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:23:10.653 [2024-12-15 05:13:30.728640] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:23:10.653 [2024-12-15 05:13:30.728647] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:23:10.653 [2024-12-15 05:13:30.728654] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:23:10.653 [2024-12-15 05:13:30.728660] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:23:10.653 [2024-12-15 05:13:30.728668] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:23:10.653 [2024-12-15 05:13:30.728674] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:23:10.653 [2024-12-15 05:13:30.728683] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:23:10.653 [2024-12-15 05:13:30.728690] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:23:10.653 [2024-12-15 05:13:30.728698] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:23:10.653 [2024-12-15 
05:13:30.728706] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:23:10.653 [2024-12-15 05:13:30.728714] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:23:10.653 [2024-12-15 05:13:30.728723] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:23:10.653 [2024-12-15 05:13:30.728730] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:23:10.653 [2024-12-15 05:13:30.728737] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:23:10.653 [2024-12-15 05:13:30.728744] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:23:10.653 [2024-12-15 05:13:30.728751] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.653 [2024-12-15 05:13:30.728759] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:23:10.653 [2024-12-15 05:13:30.728771] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.722 ms 00:23:10.653 [2024-12-15 05:13:30.728779] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.653 [2024-12-15 05:13:30.742361] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.653 [2024-12-15 05:13:30.742408] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:23:10.653 [2024-12-15 05:13:30.742421] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 13.537 ms 00:23:10.653 [2024-12-15 05:13:30.742472] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.653 [2024-12-15 05:13:30.742553] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.653 [2024-12-15 05:13:30.742568] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:23:10.653 [2024-12-15 05:13:30.742605] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.060 ms 00:23:10.653 [2024-12-15 05:13:30.742617] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.653 [2024-12-15 05:13:30.764654] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.653 [2024-12-15 05:13:30.764708] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:23:10.653 [2024-12-15 05:13:30.764721] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 21.973 ms 00:23:10.653 [2024-12-15 05:13:30.764730] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.653 [2024-12-15 05:13:30.764778] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.653 [2024-12-15 05:13:30.764797] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:23:10.653 [2024-12-15 05:13:30.764806] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:23:10.653 [2024-12-15 05:13:30.764818] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.653 [2024-12-15 05:13:30.765378] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.653 [2024-12-15 05:13:30.765429] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:23:10.653 [2024-12-15 05:13:30.765471] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.491 ms 00:23:10.653 [2024-12-15 05:13:30.765483] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.653 [2024-12-15 05:13:30.765698] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.653 [2024-12-15 05:13:30.765724] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:23:10.653 [2024-12-15 05:13:30.765737] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.181 ms 00:23:10.653 [2024-12-15 05:13:30.765747] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.653 [2024-12-15 05:13:30.773934] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.653 [2024-12-15 05:13:30.773984] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:23:10.653 [2024-12-15 05:13:30.773996] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.158 ms 00:23:10.653 [2024-12-15 05:13:30.774017] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.653 [2024-12-15 05:13:30.777998] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 4, empty chunks = 0 00:23:10.653 [2024-12-15 05:13:30.778052] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:23:10.653 [2024-12-15 05:13:30.778065] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.653 [2024-12-15 05:13:30.778073] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:23:10.654 [2024-12-15 05:13:30.778082] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.942 ms 00:23:10.654 [2024-12-15 05:13:30.778088] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.914 [2024-12-15 05:13:30.793891] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.914 [2024-12-15 05:13:30.793951] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:23:10.914 [2024-12-15 05:13:30.793963] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 15.751 ms 00:23:10.914 [2024-12-15 05:13:30.793971] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.914 [2024-12-15 05:13:30.796810] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.914 [2024-12-15 05:13:30.796990] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:23:10.914 [2024-12-15 05:13:30.797008] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.788 ms 00:23:10.914 [2024-12-15 05:13:30.797015] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.914 [2024-12-15 05:13:30.799875] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.914 [2024-12-15 05:13:30.799931] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:23:10.914 [2024-12-15 05:13:30.799943] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.485 ms 00:23:10.914 [2024-12-15 05:13:30.799950] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.914 [2024-12-15 05:13:30.800338] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.915 [2024-12-15 05:13:30.800355] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:23:10.915 [2024-12-15 05:13:30.800366] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] 
duration: 0.306 ms 00:23:10.915 [2024-12-15 05:13:30.800381] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.915 [2024-12-15 05:13:30.825020] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.915 [2024-12-15 05:13:30.825240] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:23:10.915 [2024-12-15 05:13:30.825419] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 24.614 ms 00:23:10.915 [2024-12-15 05:13:30.825477] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.915 [2024-12-15 05:13:30.833614] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:23:10.915 [2024-12-15 05:13:30.836857] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.915 [2024-12-15 05:13:30.836899] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:23:10.915 [2024-12-15 05:13:30.836911] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 11.319 ms 00:23:10.915 [2024-12-15 05:13:30.836928] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.915 [2024-12-15 05:13:30.837007] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.915 [2024-12-15 05:13:30.837019] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:23:10.915 [2024-12-15 05:13:30.837028] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:23:10.915 [2024-12-15 05:13:30.837037] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.915 [2024-12-15 05:13:30.838871] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.915 [2024-12-15 05:13:30.838916] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:23:10.915 [2024-12-15 05:13:30.838932] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.797 ms 00:23:10.915 [2024-12-15 05:13:30.838940] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.915 [2024-12-15 05:13:30.838972] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.915 [2024-12-15 05:13:30.838981] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:23:10.915 [2024-12-15 05:13:30.838990] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.009 ms 00:23:10.915 [2024-12-15 05:13:30.838997] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.915 [2024-12-15 05:13:30.839037] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:23:10.915 [2024-12-15 05:13:30.839049] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.915 [2024-12-15 05:13:30.839060] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:23:10.915 [2024-12-15 05:13:30.839068] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:23:10.915 [2024-12-15 05:13:30.839077] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.915 [2024-12-15 05:13:30.844638] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.915 [2024-12-15 05:13:30.844687] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:23:10.915 [2024-12-15 05:13:30.844698] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.543 ms 00:23:10.915 [2024-12-15 05:13:30.844706] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 
00:23:10.915 [2024-12-15 05:13:30.844788] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:23:10.915 [2024-12-15 05:13:30.844797] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:23:10.915 [2024-12-15 05:13:30.844814] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.038 ms 00:23:10.915 [2024-12-15 05:13:30.844827] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:23:10.915 [2024-12-15 05:13:30.846098] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 134.494 ms, result 0 00:23:12.300 
Copying: 757/1024 [MB] (11 MBps) [2024-12-15T05:14:22.370Z] Copying: 768/1024 [MB] (11 MBps) [2024-12-15T05:14:23.311Z] Copying: 779/1024 [MB] (11 MBps) [2024-12-15T05:14:24.251Z] Copying: 792/1024 [MB] (12 MBps) [2024-12-15T05:14:25.195Z] Copying: 815/1024 [MB] (22 MBps) [2024-12-15T05:14:26.136Z] Copying: 836/1024 [MB] (20 MBps) [2024-12-15T05:14:27.078Z] Copying: 854/1024 [MB] (18 MBps) [2024-12-15T05:14:28.463Z] Copying: 869/1024 [MB] (14 MBps) [2024-12-15T05:14:29.406Z] Copying: 886/1024 [MB] (16 MBps) [2024-12-15T05:14:30.350Z] Copying: 901/1024 [MB] (15 MBps) [2024-12-15T05:14:31.291Z] Copying: 923/1024 [MB] (21 MBps) [2024-12-15T05:14:32.231Z] Copying: 937/1024 [MB] (14 MBps) [2024-12-15T05:14:33.173Z] Copying: 957/1024 [MB] (20 MBps) [2024-12-15T05:14:34.115Z] Copying: 978/1024 [MB] (21 MBps) [2024-12-15T05:14:35.060Z] Copying: 999/1024 [MB] (20 MBps) [2024-12-15T05:14:35.630Z] Copying: 1019/1024 [MB] (19 MBps) [2024-12-15T05:14:35.891Z] Copying: 1024/1024 [MB] (average 15 MBps)[2024-12-15 05:14:35.804452] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:15.751 [2024-12-15 05:14:35.804527] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:24:15.751 [2024-12-15 05:14:35.804541] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:24:15.751 [2024-12-15 05:14:35.804549] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:15.751 [2024-12-15 05:14:35.804570] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:24:15.751 [2024-12-15 05:14:35.805291] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:15.751 [2024-12-15 05:14:35.805324] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:24:15.751 [2024-12-15 05:14:35.805334] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.707 ms 00:24:15.751 [2024-12-15 05:14:35.805341] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:15.751 [2024-12-15 05:14:35.805564] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:15.751 [2024-12-15 05:14:35.805575] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:24:15.751 [2024-12-15 05:14:35.805584] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.201 ms 00:24:15.751 [2024-12-15 05:14:35.805591] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:15.751 [2024-12-15 05:14:35.809754] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:15.751 [2024-12-15 05:14:35.809797] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:24:15.751 [2024-12-15 05:14:35.809807] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.147 ms 00:24:15.751 [2024-12-15 05:14:35.809821] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:15.751 [2024-12-15 05:14:35.814629] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:15.751 [2024-12-15 05:14:35.814808] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:24:15.751 [2024-12-15 05:14:35.814828] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.770 ms 00:24:15.751 [2024-12-15 05:14:35.814835] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:15.751 [2024-12-15 05:14:35.816452] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:15.751 [2024-12-15 05:14:35.816492] mngt/ftl_mngt.c: 
428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:24:15.751 [2024-12-15 05:14:35.816500] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.563 ms 00:24:15.751 [2024-12-15 05:14:35.816507] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:15.751 [2024-12-15 05:14:35.821101] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:15.751 [2024-12-15 05:14:35.821145] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:24:15.751 [2024-12-15 05:14:35.821161] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.557 ms 00:24:15.751 [2024-12-15 05:14:35.821167] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.014 [2024-12-15 05:14:35.919401] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:16.014 [2024-12-15 05:14:35.919458] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:24:16.014 [2024-12-15 05:14:35.919467] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 98.195 ms 00:24:16.014 [2024-12-15 05:14:35.919478] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.014 [2024-12-15 05:14:35.921175] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:16.014 [2024-12-15 05:14:35.921283] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:24:16.014 [2024-12-15 05:14:35.921296] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.684 ms 00:24:16.014 [2024-12-15 05:14:35.921302] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.014 [2024-12-15 05:14:35.922637] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:16.014 [2024-12-15 05:14:35.922660] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:24:16.014 [2024-12-15 05:14:35.922667] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.313 ms 00:24:16.014 [2024-12-15 05:14:35.922673] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.014 [2024-12-15 05:14:35.923570] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:16.014 [2024-12-15 05:14:35.923596] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:24:16.014 [2024-12-15 05:14:35.923603] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.874 ms 00:24:16.014 [2024-12-15 05:14:35.923609] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.014 [2024-12-15 05:14:35.924487] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:16.014 [2024-12-15 05:14:35.924513] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:24:16.014 [2024-12-15 05:14:35.924520] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.838 ms 00:24:16.014 [2024-12-15 05:14:35.924525] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.014 [2024-12-15 05:14:35.924547] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:24:16.014 [2024-12-15 05:14:35.924559] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 131072 / 261120 wr_cnt: 1 state: open 00:24:16.014 [2024-12-15 05:14:35.924567] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924573] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: 
free 00:24:16.014 [2024-12-15 05:14:35.924579] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924585] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924591] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924597] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924603] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924609] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924615] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924621] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924627] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924633] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924639] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:24:16.014 [2024-12-15 05:14:35.924645] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924651] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924657] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924662] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924668] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924674] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924679] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924685] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924691] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924696] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924702] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924708] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924722] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924728] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 
wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924734] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924740] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924746] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924753] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924758] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924764] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924770] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924776] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924782] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924788] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924793] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924799] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924805] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924811] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924816] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924822] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924828] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924834] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924840] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924845] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924851] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924857] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924863] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924869] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924874] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] 
Band 53: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924880] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924885] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924891] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924897] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924903] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924908] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924914] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924920] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924925] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924931] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924937] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924943] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924949] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924955] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924961] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924967] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924972] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924979] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924984] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924990] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.924996] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925001] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925007] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925013] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925019] ftl_debug.c: 167:ftl_dev_dump_bands: 
*NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925025] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925031] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925036] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925042] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925047] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925053] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925059] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925064] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925070] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925075] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925082] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925088] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925093] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925099] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925105] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925111] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925117] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925123] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925129] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925134] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925140] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925146] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:24:16.015 [2024-12-15 05:14:35.925158] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:24:16.015 [2024-12-15 05:14:35.925164] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 30f623fd-11b6-47a5-ae87-857377a3ed0b 00:24:16.016 [2024-12-15 05:14:35.925176] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: 
[FTL][ftl0] total valid LBAs: 131072 00:24:16.016 [2024-12-15 05:14:35.925182] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 27328 00:24:16.016 [2024-12-15 05:14:35.925188] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 26368 00:24:16.016 [2024-12-15 05:14:35.925195] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: 1.0364 00:24:16.016 [2024-12-15 05:14:35.925200] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:24:16.016 [2024-12-15 05:14:35.925206] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:24:16.016 [2024-12-15 05:14:35.925212] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:24:16.016 [2024-12-15 05:14:35.925217] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:24:16.016 [2024-12-15 05:14:35.925227] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:24:16.016 [2024-12-15 05:14:35.925232] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:16.016 [2024-12-15 05:14:35.925238] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:24:16.016 [2024-12-15 05:14:35.925250] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.686 ms 00:24:16.016 [2024-12-15 05:14:35.925255] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.926509] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:16.016 [2024-12-15 05:14:35.926528] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:24:16.016 [2024-12-15 05:14:35.926534] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.242 ms 00:24:16.016 [2024-12-15 05:14:35.926540] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.926616] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:16.016 [2024-12-15 05:14:35.926623] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:24:16.016 [2024-12-15 05:14:35.926630] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.053 ms 00:24:16.016 [2024-12-15 05:14:35.926639] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.930841] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.930924] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:24:16.016 [2024-12-15 05:14:35.930975] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.930993] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.931044] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.931112] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:24:16.016 [2024-12-15 05:14:35.931159] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.931180] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.931221] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.931239] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:24:16.016 [2024-12-15 05:14:35.931254] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 
[2024-12-15 05:14:35.931314] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.931334] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.931349] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:24:16.016 [2024-12-15 05:14:35.931363] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.931404] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.938898] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.939007] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:24:16.016 [2024-12-15 05:14:35.939055] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.939075] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.945234] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.945350] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:24:16.016 [2024-12-15 05:14:35.945396] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.945418] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.945470] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.945528] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:24:16.016 [2024-12-15 05:14:35.945572] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.945589] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.945617] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.945634] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:24:16.016 [2024-12-15 05:14:35.945648] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.945662] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.945730] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.945775] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:24:16.016 [2024-12-15 05:14:35.945791] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.945805] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.945844] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.945861] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:24:16.016 [2024-12-15 05:14:35.945911] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.946137] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.946215] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.946258] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:24:16.016 [2024-12-15 05:14:35.946311] mngt/ftl_mngt.c: 
430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.946329] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.946377] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:24:16.016 [2024-12-15 05:14:35.946399] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:24:16.016 [2024-12-15 05:14:35.946458] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:24:16.016 [2024-12-15 05:14:35.946478] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:16.016 [2024-12-15 05:14:35.946634] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 142.182 ms, result 0 00:24:16.016 00:24:16.016 00:24:16.016 05:14:36 ftl.ftl_restore -- ftl/restore.sh@82 -- # md5sum -c /home/vagrant/spdk_repo/spdk/test/ftl/testfile.md5 00:24:18.564 /home/vagrant/spdk_repo/spdk/test/ftl/testfile: OK 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/restore.sh@84 -- # trap - SIGINT SIGTERM EXIT 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/restore.sh@85 -- # restore_kill 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/restore.sh@28 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/testfile 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/restore.sh@29 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/testfile.md5 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/restore.sh@30 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/restore.sh@32 -- # killprocess 90280 00:24:18.564 05:14:38 ftl.ftl_restore -- common/autotest_common.sh@954 -- # '[' -z 90280 ']' 00:24:18.564 05:14:38 ftl.ftl_restore -- common/autotest_common.sh@958 -- # kill -0 90280 00:24:18.564 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 958: kill: (90280) - No such process 00:24:18.564 Process with pid 90280 is not found 00:24:18.564 05:14:38 ftl.ftl_restore -- common/autotest_common.sh@981 -- # echo 'Process with pid 90280 is not found' 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/restore.sh@33 -- # remove_shm 00:24:18.564 Remove shared memory files 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/common.sh@204 -- # echo Remove shared memory files 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/common.sh@205 -- # rm -f rm -f 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/common.sh@206 -- # rm -f rm -f 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/common.sh@207 -- # rm -f rm -f 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/common.sh@208 -- # rm -f rm -f /dev/shm/iscsi 00:24:18.564 05:14:38 ftl.ftl_restore -- ftl/common.sh@209 -- # rm -f rm -f 00:24:18.564 ************************************ 00:24:18.564 END TEST ftl_restore 00:24:18.564 ************************************ 00:24:18.564 00:24:18.564 real 4m28.717s 00:24:18.564 user 4m15.512s 00:24:18.564 sys 0m12.831s 00:24:18.564 05:14:38 ftl.ftl_restore -- common/autotest_common.sh@1130 -- # xtrace_disable 00:24:18.564 05:14:38 ftl.ftl_restore -- common/autotest_common.sh@10 -- # set +x 00:24:18.564 05:14:38 ftl -- ftl/ftl.sh@77 -- # run_test ftl_dirty_shutdown /home/vagrant/spdk_repo/spdk/test/ftl/dirty_shutdown.sh -c 0000:00:10.0 0000:00:11.0 00:24:18.564 05:14:38 ftl -- common/autotest_common.sh@1105 -- # '[' 5 -le 1 ']' 00:24:18.564 05:14:38 ftl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:24:18.564 05:14:38 ftl -- common/autotest_common.sh@10 -- # set +x 00:24:18.564 ************************************ 00:24:18.564 START 
TEST ftl_dirty_shutdown 00:24:18.564 ************************************ 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/ftl/dirty_shutdown.sh -c 0000:00:10.0 0000:00:11.0 00:24:18.564 * Looking for test storage... 00:24:18.564 * Found test storage at /home/vagrant/spdk_repo/spdk/test/ftl 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1711 -- # lcov --version 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@333 -- # local ver1 ver1_l 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@334 -- # local ver2 ver2_l 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@336 -- # IFS=.-: 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@336 -- # read -ra ver1 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@337 -- # IFS=.-: 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@337 -- # read -ra ver2 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@338 -- # local 'op=<' 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@340 -- # ver1_l=2 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@341 -- # ver2_l=1 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@344 -- # case "$op" in 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@345 -- # : 1 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@364 -- # (( v = 0 )) 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@365 -- # decimal 1 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@353 -- # local d=1 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@355 -- # echo 1 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@365 -- # ver1[v]=1 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@366 -- # decimal 2 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@353 -- # local d=2 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@355 -- # echo 2 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@366 -- # ver2[v]=2 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- scripts/common.sh@368 -- # return 0 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:24:18.564 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:24:18.564 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:24:18.564 --rc genhtml_branch_coverage=1 00:24:18.564 --rc genhtml_function_coverage=1 00:24:18.564 --rc genhtml_legend=1 00:24:18.564 --rc geninfo_all_blocks=1 00:24:18.564 --rc geninfo_unexecuted_blocks=1 00:24:18.565 00:24:18.565 ' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:24:18.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:24:18.565 --rc genhtml_branch_coverage=1 00:24:18.565 --rc genhtml_function_coverage=1 00:24:18.565 --rc genhtml_legend=1 00:24:18.565 --rc geninfo_all_blocks=1 00:24:18.565 --rc geninfo_unexecuted_blocks=1 00:24:18.565 00:24:18.565 ' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:24:18.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:24:18.565 --rc genhtml_branch_coverage=1 00:24:18.565 --rc genhtml_function_coverage=1 00:24:18.565 --rc genhtml_legend=1 00:24:18.565 --rc geninfo_all_blocks=1 00:24:18.565 --rc geninfo_unexecuted_blocks=1 00:24:18.565 00:24:18.565 ' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:24:18.565 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:24:18.565 --rc genhtml_branch_coverage=1 00:24:18.565 --rc genhtml_function_coverage=1 00:24:18.565 --rc genhtml_legend=1 00:24:18.565 --rc geninfo_all_blocks=1 00:24:18.565 --rc geninfo_unexecuted_blocks=1 00:24:18.565 00:24:18.565 ' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/ftl/common.sh 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/ftl/dirty_shutdown.sh 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@8 -- # testdir=/home/vagrant/spdk_repo/spdk/test/ftl 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@9 -- # readlink -f 
/home/vagrant/spdk_repo/spdk/test/ftl/../.. 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@9 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@10 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@12 -- # export 'ftl_tgt_core_mask=[0]' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@12 -- # ftl_tgt_core_mask='[0]' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@14 -- # export spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@14 -- # spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@15 -- # export 'spdk_tgt_cpumask=[0]' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@15 -- # spdk_tgt_cpumask='[0]' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@16 -- # export spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@16 -- # spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@17 -- # export spdk_tgt_pid= 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@17 -- # spdk_tgt_pid= 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@19 -- # export spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@19 -- # spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@20 -- # export 'spdk_ini_cpumask=[1]' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@20 -- # spdk_ini_cpumask='[1]' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@21 -- # export spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@21 -- # spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@22 -- # export spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@22 -- # spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@23 -- # export spdk_ini_pid= 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@23 -- # spdk_ini_pid= 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@25 -- # export spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/common.sh@25 -- # spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@11 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@12 -- # spdk_dd=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@14 -- # getopts :u:c: opt 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@15 -- # case $opt in 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@17 -- # nv_cache=0000:00:10.0 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@14 -- # getopts :u:c: opt 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@21 -- # shift 2 00:24:18.565 05:14:38 
ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@23 -- # device=0000:00:11.0 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@24 -- # timeout=240 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@26 -- # block_size=4096 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@27 -- # chunk_size=262144 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@28 -- # data_size=262144 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@42 -- # trap 'restore_kill; exit 1' SIGINT SIGTERM EXIT 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@45 -- # svcpid=93129 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@44 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt -m 0x1 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@47 -- # waitforlisten 93129 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@835 -- # '[' -z 93129 ']' 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@840 -- # local max_retries=100 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:24:18.565 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@844 -- # xtrace_disable 00:24:18.565 05:14:38 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@10 -- # set +x 00:24:18.826 [2024-12-15 05:14:38.739731] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
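waitforlisten above blocks until the freshly launched spdk_tgt answers on its RPC socket before the test proceeds. A minimal sketch of that launch-and-poll pattern, assuming SPDK_DIR points at an SPDK checkout (the log uses /home/vagrant/spdk_repo/spdk) and the target listens on the default /var/tmp/spdk.sock:

    # Start the SPDK target pinned to core 0 (-m 0x1, as in the log),
    # then poll its RPC server until it responds; spdk_get_version is
    # a cheap built-in RPC suitable for a readiness check.
    "$SPDK_DIR/build/bin/spdk_tgt" -m 0x1 &
    svcpid=$!
    until "$SPDK_DIR/scripts/rpc.py" spdk_get_version >/dev/null 2>&1; do
        kill -0 "$svcpid" 2>/dev/null || exit 1   # give up if the target died
        sleep 0.5
    done
    echo "spdk_tgt (pid $svcpid) is ready"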
00:24:18.826 [2024-12-15 05:14:38.740097] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid93129 ] 00:24:18.826 [2024-12-15 05:14:38.900529] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:18.826 [2024-12-15 05:14:38.929282] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:24:19.398 05:14:39 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:24:19.398 05:14:39 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@868 -- # return 0 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@49 -- # create_base_bdev nvme0 0000:00:11.0 103424 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- ftl/common.sh@54 -- # local name=nvme0 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- ftl/common.sh@55 -- # local base_bdf=0000:00:11.0 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- ftl/common.sh@56 -- # local size=103424 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- ftl/common.sh@59 -- # local base_bdev 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- ftl/common.sh@60 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t PCIe -a 0000:00:11.0 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- ftl/common.sh@60 -- # base_bdev=nvme0n1 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- ftl/common.sh@62 -- # local base_size 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- ftl/common.sh@63 -- # get_bdev_size nvme0n1 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1382 -- # local bdev_name=nvme0n1 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1383 -- # local bdev_info 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1384 -- # local bs 00:24:19.659 05:14:39 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1385 -- # local nb 00:24:19.660 05:14:39 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b nvme0n1 00:24:19.920 05:14:39 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:24:19.920 { 00:24:19.920 "name": "nvme0n1", 00:24:19.920 "aliases": [ 00:24:19.920 "73ddd388-ad7f-40cf-9794-254f557e104a" 00:24:19.920 ], 00:24:19.920 "product_name": "NVMe disk", 00:24:19.920 "block_size": 4096, 00:24:19.920 "num_blocks": 1310720, 00:24:19.920 "uuid": "73ddd388-ad7f-40cf-9794-254f557e104a", 00:24:19.920 "numa_id": -1, 00:24:19.920 "assigned_rate_limits": { 00:24:19.920 "rw_ios_per_sec": 0, 00:24:19.920 "rw_mbytes_per_sec": 0, 00:24:19.920 "r_mbytes_per_sec": 0, 00:24:19.920 "w_mbytes_per_sec": 0 00:24:19.920 }, 00:24:19.920 "claimed": true, 00:24:19.920 "claim_type": "read_many_write_one", 00:24:19.920 "zoned": false, 00:24:19.920 "supported_io_types": { 00:24:19.920 "read": true, 00:24:19.920 "write": true, 00:24:19.920 "unmap": true, 00:24:19.920 "flush": true, 00:24:19.920 "reset": true, 00:24:19.920 "nvme_admin": true, 00:24:19.920 "nvme_io": true, 00:24:19.920 "nvme_io_md": false, 00:24:19.920 "write_zeroes": true, 00:24:19.920 "zcopy": false, 00:24:19.920 "get_zone_info": false, 00:24:19.920 "zone_management": false, 00:24:19.920 "zone_append": false, 00:24:19.920 "compare": true, 00:24:19.920 "compare_and_write": false, 00:24:19.920 "abort": true, 00:24:19.920 "seek_hole": false, 00:24:19.920 "seek_data": false, 00:24:19.920 
"copy": true, 00:24:19.920 "nvme_iov_md": false 00:24:19.920 }, 00:24:19.920 "driver_specific": { 00:24:19.920 "nvme": [ 00:24:19.920 { 00:24:19.920 "pci_address": "0000:00:11.0", 00:24:19.920 "trid": { 00:24:19.920 "trtype": "PCIe", 00:24:19.920 "traddr": "0000:00:11.0" 00:24:19.920 }, 00:24:19.920 "ctrlr_data": { 00:24:19.920 "cntlid": 0, 00:24:19.920 "vendor_id": "0x1b36", 00:24:19.920 "model_number": "QEMU NVMe Ctrl", 00:24:19.920 "serial_number": "12341", 00:24:19.920 "firmware_revision": "8.0.0", 00:24:19.920 "subnqn": "nqn.2019-08.org.qemu:12341", 00:24:19.920 "oacs": { 00:24:19.920 "security": 0, 00:24:19.920 "format": 1, 00:24:19.920 "firmware": 0, 00:24:19.920 "ns_manage": 1 00:24:19.920 }, 00:24:19.920 "multi_ctrlr": false, 00:24:19.920 "ana_reporting": false 00:24:19.921 }, 00:24:19.921 "vs": { 00:24:19.921 "nvme_version": "1.4" 00:24:19.921 }, 00:24:19.921 "ns_data": { 00:24:19.921 "id": 1, 00:24:19.921 "can_share": false 00:24:19.921 } 00:24:19.921 } 00:24:19.921 ], 00:24:19.921 "mp_policy": "active_passive" 00:24:19.921 } 00:24:19.921 } 00:24:19.921 ]' 00:24:19.921 05:14:39 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:24:19.921 05:14:40 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1387 -- # bs=4096 00:24:19.921 05:14:40 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:24:19.921 05:14:40 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1388 -- # nb=1310720 00:24:19.921 05:14:40 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1391 -- # bdev_size=5120 00:24:19.921 05:14:40 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1392 -- # echo 5120 00:24:19.921 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@63 -- # base_size=5120 00:24:19.921 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@64 -- # [[ 103424 -le 5120 ]] 00:24:19.921 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@67 -- # clear_lvols 00:24:19.921 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_get_lvstores 00:24:19.921 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@28 -- # jq -r '.[] | .uuid' 00:24:20.182 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@28 -- # stores=e7561374-28d2-4ab0-b451-50ad9d9ca932 00:24:20.182 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@29 -- # for lvs in $stores 00:24:20.182 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@30 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -u e7561374-28d2-4ab0-b451-50ad9d9ca932 00:24:20.443 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@68 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create_lvstore nvme0n1 lvs 00:24:20.704 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@68 -- # lvs=ae21f3e8-031a-4d3c-a3cb-0384416cd2f7 00:24:20.704 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@69 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create nvme0n1p0 103424 -t -u ae21f3e8-031a-4d3c-a3cb-0384416cd2f7 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@49 -- # split_bdev=139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@51 -- # '[' -n 0000:00:10.0 ']' 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@52 -- # create_nv_cache_bdev nvc0 0000:00:10.0 139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@35 -- # local name=nvc0 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@36 -- # local 
cache_bdf=0000:00:10.0 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@37 -- # local base_bdev=139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@38 -- # local cache_size= 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- ftl/common.sh@41 -- # get_bdev_size 139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1382 -- # local bdev_name=139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1383 -- # local bdev_info 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1384 -- # local bs 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1385 -- # local nb 00:24:20.965 05:14:40 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:20.965 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:24:20.965 { 00:24:20.965 "name": "139d1201-e66a-4355-88e0-55a17f15ea5a", 00:24:20.965 "aliases": [ 00:24:20.965 "lvs/nvme0n1p0" 00:24:20.965 ], 00:24:20.965 "product_name": "Logical Volume", 00:24:20.965 "block_size": 4096, 00:24:20.965 "num_blocks": 26476544, 00:24:20.965 "uuid": "139d1201-e66a-4355-88e0-55a17f15ea5a", 00:24:20.965 "assigned_rate_limits": { 00:24:20.965 "rw_ios_per_sec": 0, 00:24:20.965 "rw_mbytes_per_sec": 0, 00:24:20.965 "r_mbytes_per_sec": 0, 00:24:20.965 "w_mbytes_per_sec": 0 00:24:20.965 }, 00:24:20.965 "claimed": false, 00:24:20.965 "zoned": false, 00:24:20.965 "supported_io_types": { 00:24:20.965 "read": true, 00:24:20.965 "write": true, 00:24:20.965 "unmap": true, 00:24:20.965 "flush": false, 00:24:20.965 "reset": true, 00:24:20.965 "nvme_admin": false, 00:24:20.965 "nvme_io": false, 00:24:20.965 "nvme_io_md": false, 00:24:20.965 "write_zeroes": true, 00:24:20.965 "zcopy": false, 00:24:20.965 "get_zone_info": false, 00:24:20.965 "zone_management": false, 00:24:20.965 "zone_append": false, 00:24:20.965 "compare": false, 00:24:20.965 "compare_and_write": false, 00:24:20.965 "abort": false, 00:24:20.965 "seek_hole": true, 00:24:20.965 "seek_data": true, 00:24:20.965 "copy": false, 00:24:20.965 "nvme_iov_md": false 00:24:20.965 }, 00:24:20.965 "driver_specific": { 00:24:20.965 "lvol": { 00:24:20.965 "lvol_store_uuid": "ae21f3e8-031a-4d3c-a3cb-0384416cd2f7", 00:24:20.965 "base_bdev": "nvme0n1", 00:24:20.965 "thin_provision": true, 00:24:20.965 "num_allocated_clusters": 0, 00:24:20.965 "snapshot": false, 00:24:20.965 "clone": false, 00:24:20.965 "esnap_clone": false 00:24:20.965 } 00:24:20.965 } 00:24:20.965 } 00:24:20.965 ]' 00:24:20.965 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:24:20.965 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1387 -- # bs=4096 00:24:20.965 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1388 -- # nb=26476544 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1392 -- # echo 103424 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- ftl/common.sh@41 -- # local base_size=5171 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- ftl/common.sh@44 -- # local nvc_bdev 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- 
ftl/common.sh@45 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvc0 -t PCIe -a 0000:00:10.0 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- ftl/common.sh@45 -- # nvc_bdev=nvc0n1 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- ftl/common.sh@47 -- # [[ -z '' ]] 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- ftl/common.sh@48 -- # get_bdev_size 139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1382 -- # local bdev_name=139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1383 -- # local bdev_info 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1384 -- # local bs 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1385 -- # local nb 00:24:21.226 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:21.487 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:24:21.487 { 00:24:21.487 "name": "139d1201-e66a-4355-88e0-55a17f15ea5a", 00:24:21.487 "aliases": [ 00:24:21.487 "lvs/nvme0n1p0" 00:24:21.487 ], 00:24:21.487 "product_name": "Logical Volume", 00:24:21.487 "block_size": 4096, 00:24:21.487 "num_blocks": 26476544, 00:24:21.487 "uuid": "139d1201-e66a-4355-88e0-55a17f15ea5a", 00:24:21.487 "assigned_rate_limits": { 00:24:21.487 "rw_ios_per_sec": 0, 00:24:21.487 "rw_mbytes_per_sec": 0, 00:24:21.487 "r_mbytes_per_sec": 0, 00:24:21.487 "w_mbytes_per_sec": 0 00:24:21.487 }, 00:24:21.487 "claimed": false, 00:24:21.487 "zoned": false, 00:24:21.487 "supported_io_types": { 00:24:21.487 "read": true, 00:24:21.487 "write": true, 00:24:21.487 "unmap": true, 00:24:21.487 "flush": false, 00:24:21.487 "reset": true, 00:24:21.487 "nvme_admin": false, 00:24:21.487 "nvme_io": false, 00:24:21.487 "nvme_io_md": false, 00:24:21.487 "write_zeroes": true, 00:24:21.487 "zcopy": false, 00:24:21.487 "get_zone_info": false, 00:24:21.487 "zone_management": false, 00:24:21.487 "zone_append": false, 00:24:21.487 "compare": false, 00:24:21.487 "compare_and_write": false, 00:24:21.487 "abort": false, 00:24:21.487 "seek_hole": true, 00:24:21.487 "seek_data": true, 00:24:21.487 "copy": false, 00:24:21.487 "nvme_iov_md": false 00:24:21.487 }, 00:24:21.487 "driver_specific": { 00:24:21.488 "lvol": { 00:24:21.488 "lvol_store_uuid": "ae21f3e8-031a-4d3c-a3cb-0384416cd2f7", 00:24:21.488 "base_bdev": "nvme0n1", 00:24:21.488 "thin_provision": true, 00:24:21.488 "num_allocated_clusters": 0, 00:24:21.488 "snapshot": false, 00:24:21.488 "clone": false, 00:24:21.488 "esnap_clone": false 00:24:21.488 } 00:24:21.488 } 00:24:21.488 } 00:24:21.488 ]' 00:24:21.488 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:24:21.488 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1387 -- # bs=4096 00:24:21.488 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:24:21.488 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1388 -- # nb=26476544 00:24:21.488 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:24:21.488 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1392 -- # echo 103424 00:24:21.488 05:14:41 ftl.ftl_dirty_shutdown -- ftl/common.sh@48 -- # cache_size=5171 00:24:21.488 05:14:41 ftl.ftl_dirty_shutdown -- ftl/common.sh@50 -- # 
/home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_split_create nvc0n1 -s 5171 1 00:24:21.748 05:14:41 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@52 -- # nvc_bdev=nvc0n1p0 00:24:21.748 05:14:41 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@55 -- # get_bdev_size 139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:21.748 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1382 -- # local bdev_name=139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:21.748 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1383 -- # local bdev_info 00:24:21.748 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1384 -- # local bs 00:24:21.748 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1385 -- # local nb 00:24:21.748 05:14:41 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b 139d1201-e66a-4355-88e0-55a17f15ea5a 00:24:22.009 05:14:42 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:24:22.009 { 00:24:22.009 "name": "139d1201-e66a-4355-88e0-55a17f15ea5a", 00:24:22.009 "aliases": [ 00:24:22.009 "lvs/nvme0n1p0" 00:24:22.009 ], 00:24:22.009 "product_name": "Logical Volume", 00:24:22.009 "block_size": 4096, 00:24:22.009 "num_blocks": 26476544, 00:24:22.009 "uuid": "139d1201-e66a-4355-88e0-55a17f15ea5a", 00:24:22.009 "assigned_rate_limits": { 00:24:22.009 "rw_ios_per_sec": 0, 00:24:22.009 "rw_mbytes_per_sec": 0, 00:24:22.009 "r_mbytes_per_sec": 0, 00:24:22.009 "w_mbytes_per_sec": 0 00:24:22.009 }, 00:24:22.009 "claimed": false, 00:24:22.009 "zoned": false, 00:24:22.009 "supported_io_types": { 00:24:22.009 "read": true, 00:24:22.009 "write": true, 00:24:22.009 "unmap": true, 00:24:22.009 "flush": false, 00:24:22.009 "reset": true, 00:24:22.009 "nvme_admin": false, 00:24:22.009 "nvme_io": false, 00:24:22.010 "nvme_io_md": false, 00:24:22.010 "write_zeroes": true, 00:24:22.010 "zcopy": false, 00:24:22.010 "get_zone_info": false, 00:24:22.010 "zone_management": false, 00:24:22.010 "zone_append": false, 00:24:22.010 "compare": false, 00:24:22.010 "compare_and_write": false, 00:24:22.010 "abort": false, 00:24:22.010 "seek_hole": true, 00:24:22.010 "seek_data": true, 00:24:22.010 "copy": false, 00:24:22.010 "nvme_iov_md": false 00:24:22.010 }, 00:24:22.010 "driver_specific": { 00:24:22.010 "lvol": { 00:24:22.010 "lvol_store_uuid": "ae21f3e8-031a-4d3c-a3cb-0384416cd2f7", 00:24:22.010 "base_bdev": "nvme0n1", 00:24:22.010 "thin_provision": true, 00:24:22.010 "num_allocated_clusters": 0, 00:24:22.010 "snapshot": false, 00:24:22.010 "clone": false, 00:24:22.010 "esnap_clone": false 00:24:22.010 } 00:24:22.010 } 00:24:22.010 } 00:24:22.010 ]' 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1387 -- # bs=4096 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1388 -- # nb=26476544 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1392 -- # echo 103424 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@55 -- # l2p_dram_size_mb=10 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@56 -- # ftl_construct_args='bdev_ftl_create -b ftl0 -d 139d1201-e66a-4355-88e0-55a17f15ea5a 
--l2p_dram_limit 10' 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@58 -- # '[' -n '' ']' 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@59 -- # '[' -n 0000:00:10.0 ']' 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@59 -- # ftl_construct_args+=' -c nvc0n1p0' 00:24:22.010 05:14:42 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@61 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -t 240 bdev_ftl_create -b ftl0 -d 139d1201-e66a-4355-88e0-55a17f15ea5a --l2p_dram_limit 10 -c nvc0n1p0 00:24:22.272 [2024-12-15 05:14:42.218610] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.218649] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:24:22.272 [2024-12-15 05:14:42.218660] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:24:22.272 [2024-12-15 05:14:42.218672] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.218717] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.218726] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:24:22.272 [2024-12-15 05:14:42.218735] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.031 ms 00:24:22.272 [2024-12-15 05:14:42.218743] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.218761] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:24:22.272 [2024-12-15 05:14:42.218979] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:24:22.272 [2024-12-15 05:14:42.218991] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.219003] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:24:22.272 [2024-12-15 05:14:42.219009] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.237 ms 00:24:22.272 [2024-12-15 05:14:42.219017] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.219039] mngt/ftl_mngt_md.c: 570:ftl_mngt_superblock_init: *NOTICE*: [FTL][ftl0] Create new FTL, UUID a3f0cc82-b3b1-4076-8939-aceeb9079b03 00:24:22.272 [2024-12-15 05:14:42.220096] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.220120] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Default-initialize superblock 00:24:22.272 [2024-12-15 05:14:42.220129] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.022 ms 00:24:22.272 [2024-12-15 05:14:42.220135] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.224912] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.224943] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:24:22.272 [2024-12-15 05:14:42.224952] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.717 ms 00:24:22.272 [2024-12-15 05:14:42.224958] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.225019] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.225026] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:24:22.272 [2024-12-15 05:14:42.225034] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.041 ms 00:24:22.272 [2024-12-15 05:14:42.225039] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.225073] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.225081] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:24:22.272 [2024-12-15 05:14:42.225088] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.010 ms 00:24:22.272 [2024-12-15 05:14:42.225094] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.225112] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:24:22.272 [2024-12-15 05:14:42.226367] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.226395] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:24:22.272 [2024-12-15 05:14:42.226402] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.262 ms 00:24:22.272 [2024-12-15 05:14:42.226413] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.226448] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.226457] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:24:22.272 [2024-12-15 05:14:42.226463] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.017 ms 00:24:22.272 [2024-12-15 05:14:42.226471] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.226491] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 1 00:24:22.272 [2024-12-15 05:14:42.226612] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:24:22.272 [2024-12-15 05:14:42.226621] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:24:22.272 [2024-12-15 05:14:42.226631] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:24:22.272 [2024-12-15 05:14:42.226639] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:24:22.272 [2024-12-15 05:14:42.226649] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:24:22.272 [2024-12-15 05:14:42.226658] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:24:22.272 [2024-12-15 05:14:42.226666] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:24:22.272 [2024-12-15 05:14:42.226672] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:24:22.272 [2024-12-15 05:14:42.226679] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:24:22.272 [2024-12-15 05:14:42.226685] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.226692] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:24:22.272 [2024-12-15 05:14:42.226699] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.195 ms 00:24:22.272 [2024-12-15 05:14:42.226706] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.226771] mngt/ftl_mngt.c: 
427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.272 [2024-12-15 05:14:42.226780] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:24:22.272 [2024-12-15 05:14:42.226786] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.053 ms 00:24:22.272 [2024-12-15 05:14:42.226796] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.272 [2024-12-15 05:14:42.226866] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:24:22.272 [2024-12-15 05:14:42.226875] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:24:22.272 [2024-12-15 05:14:42.226881] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:24:22.272 [2024-12-15 05:14:42.226888] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:24:22.272 [2024-12-15 05:14:42.226894] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:24:22.273 [2024-12-15 05:14:42.226900] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:24:22.273 [2024-12-15 05:14:42.226905] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:24:22.273 [2024-12-15 05:14:42.226911] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:24:22.273 [2024-12-15 05:14:42.226916] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:24:22.273 [2024-12-15 05:14:42.226923] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:24:22.273 [2024-12-15 05:14:42.226928] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:24:22.273 [2024-12-15 05:14:42.226935] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:24:22.273 [2024-12-15 05:14:42.226940] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:24:22.273 [2024-12-15 05:14:42.226948] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:24:22.273 [2024-12-15 05:14:42.226953] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:24:22.273 [2024-12-15 05:14:42.226960] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:24:22.273 [2024-12-15 05:14:42.226965] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:24:22.273 [2024-12-15 05:14:42.226972] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:24:22.273 [2024-12-15 05:14:42.226976] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:24:22.273 [2024-12-15 05:14:42.226983] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:24:22.273 [2024-12-15 05:14:42.226988] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:24:22.273 [2024-12-15 05:14:42.226995] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:24:22.273 [2024-12-15 05:14:42.227000] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:24:22.273 [2024-12-15 05:14:42.227006] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:24:22.273 [2024-12-15 05:14:42.227011] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:24:22.273 [2024-12-15 05:14:42.227018] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:24:22.273 [2024-12-15 05:14:42.227023] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:24:22.273 [2024-12-15 05:14:42.227030] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:24:22.273 [2024-12-15 05:14:42.227036] 
ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:24:22.273 [2024-12-15 05:14:42.227044] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:24:22.273 [2024-12-15 05:14:42.227050] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:24:22.273 [2024-12-15 05:14:42.227057] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:24:22.273 [2024-12-15 05:14:42.227063] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:24:22.273 [2024-12-15 05:14:42.227070] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:24:22.273 [2024-12-15 05:14:42.227076] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:24:22.273 [2024-12-15 05:14:42.227083] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:24:22.273 [2024-12-15 05:14:42.227088] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:24:22.273 [2024-12-15 05:14:42.227095] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:24:22.273 [2024-12-15 05:14:42.227101] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:24:22.273 [2024-12-15 05:14:42.227108] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:24:22.273 [2024-12-15 05:14:42.227113] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:24:22.273 [2024-12-15 05:14:42.227120] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:24:22.273 [2024-12-15 05:14:42.227126] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:24:22.273 [2024-12-15 05:14:42.227133] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:24:22.273 [2024-12-15 05:14:42.227139] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:24:22.273 [2024-12-15 05:14:42.227148] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:24:22.273 [2024-12-15 05:14:42.227154] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:24:22.273 [2024-12-15 05:14:42.227163] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:24:22.273 [2024-12-15 05:14:42.227169] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:24:22.273 [2024-12-15 05:14:42.227176] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:24:22.273 [2024-12-15 05:14:42.227182] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:24:22.273 [2024-12-15 05:14:42.227190] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:24:22.273 [2024-12-15 05:14:42.227197] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:24:22.273 [2024-12-15 05:14:42.227206] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:24:22.273 [2024-12-15 05:14:42.227217] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:24:22.273 [2024-12-15 05:14:42.227225] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:24:22.273 [2024-12-15 05:14:42.227232] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:24:22.273 [2024-12-15 05:14:42.227239] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: 
*NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:24:22.273 [2024-12-15 05:14:42.227245] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:24:22.273 [2024-12-15 05:14:42.227252] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:24:22.273 [2024-12-15 05:14:42.227258] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:24:22.273 [2024-12-15 05:14:42.227267] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:24:22.273 [2024-12-15 05:14:42.227274] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:24:22.273 [2024-12-15 05:14:42.227281] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:24:22.273 [2024-12-15 05:14:42.227287] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:24:22.273 [2024-12-15 05:14:42.227294] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:24:22.273 [2024-12-15 05:14:42.227300] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:24:22.273 [2024-12-15 05:14:42.227307] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:24:22.273 [2024-12-15 05:14:42.227314] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:24:22.273 [2024-12-15 05:14:42.227321] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:24:22.273 [2024-12-15 05:14:42.227329] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:24:22.273 [2024-12-15 05:14:42.227336] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:24:22.273 [2024-12-15 05:14:42.227342] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:24:22.273 [2024-12-15 05:14:42.227350] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:24:22.273 [2024-12-15 05:14:42.227356] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:24:22.273 [2024-12-15 05:14:42.227365] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:22.273 [2024-12-15 05:14:42.227371] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:24:22.273 [2024-12-15 05:14:42.227380] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.547 ms 00:24:22.273 [2024-12-15 05:14:42.227386] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:22.273 [2024-12-15 05:14:42.227417] mngt/ftl_mngt_misc.c: 
165:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] NV cache data region needs scrubbing, this may take a while. 00:24:22.273 [2024-12-15 05:14:42.227424] mngt/ftl_mngt_misc.c: 166:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] Scrubbing 5 chunks 00:24:25.575 [2024-12-15 05:14:45.678132] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.575 [2024-12-15 05:14:45.678228] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Scrub NV cache 00:24:25.575 [2024-12-15 05:14:45.678250] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3450.692 ms 00:24:25.575 [2024-12-15 05:14:45.678260] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.575 [2024-12-15 05:14:45.691683] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.575 [2024-12-15 05:14:45.691738] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:24:25.575 [2024-12-15 05:14:45.691761] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 13.288 ms 00:24:25.575 [2024-12-15 05:14:45.691770] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.575 [2024-12-15 05:14:45.691899] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.575 [2024-12-15 05:14:45.691910] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:24:25.575 [2024-12-15 05:14:45.691922] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.072 ms 00:24:25.575 [2024-12-15 05:14:45.691930] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.575 [2024-12-15 05:14:45.704744] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.575 [2024-12-15 05:14:45.704958] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:24:25.575 [2024-12-15 05:14:45.704983] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 12.766 ms 00:24:25.575 [2024-12-15 05:14:45.704996] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.575 [2024-12-15 05:14:45.705034] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.575 [2024-12-15 05:14:45.705043] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:24:25.575 [2024-12-15 05:14:45.705054] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:24:25.575 [2024-12-15 05:14:45.705062] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.575 [2024-12-15 05:14:45.705666] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.575 [2024-12-15 05:14:45.705698] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:24:25.575 [2024-12-15 05:14:45.705713] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.547 ms 00:24:25.575 [2024-12-15 05:14:45.705722] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.575 [2024-12-15 05:14:45.705851] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.575 [2024-12-15 05:14:45.705862] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:24:25.575 [2024-12-15 05:14:45.705874] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.096 ms 00:24:25.575 [2024-12-15 05:14:45.705883] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.714184] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.714234] 
mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:24:25.836 [2024-12-15 05:14:45.714247] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.275 ms 00:24:25.836 [2024-12-15 05:14:45.714255] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.732596] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:24:25.836 [2024-12-15 05:14:45.736525] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.736575] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:24:25.836 [2024-12-15 05:14:45.736590] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 22.198 ms 00:24:25.836 [2024-12-15 05:14:45.736603] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.828284] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.828558] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear L2P 00:24:25.836 [2024-12-15 05:14:45.828585] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 91.632 ms 00:24:25.836 [2024-12-15 05:14:45.828599] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.828801] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.828822] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:24:25.836 [2024-12-15 05:14:45.828831] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.155 ms 00:24:25.836 [2024-12-15 05:14:45.828842] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.834807] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.834866] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial band info metadata 00:24:25.836 [2024-12-15 05:14:45.834881] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.929 ms 00:24:25.836 [2024-12-15 05:14:45.834892] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.839769] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.839954] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial chunk info metadata 00:24:25.836 [2024-12-15 05:14:45.839973] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.830 ms 00:24:25.836 [2024-12-15 05:14:45.839983] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.840375] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.840393] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:24:25.836 [2024-12-15 05:14:45.840403] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.317 ms 00:24:25.836 [2024-12-15 05:14:45.840416] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.886760] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.886951] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Wipe P2L region 00:24:25.836 [2024-12-15 05:14:45.886974] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 46.291 ms 00:24:25.836 [2024-12-15 05:14:45.886985] 
mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.893746] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.893917] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim map 00:24:25.836 [2024-12-15 05:14:45.893935] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.668 ms 00:24:25.836 [2024-12-15 05:14:45.893946] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.899498] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.899550] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim log 00:24:25.836 [2024-12-15 05:14:45.899560] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.482 ms 00:24:25.836 [2024-12-15 05:14:45.899570] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.905598] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.905654] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:24:25.836 [2024-12-15 05:14:45.905665] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.983 ms 00:24:25.836 [2024-12-15 05:14:45.905677] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.905726] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.905746] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:24:25.836 [2024-12-15 05:14:45.905756] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.007 ms 00:24:25.836 [2024-12-15 05:14:45.905766] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.905837] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:24:25.836 [2024-12-15 05:14:45.905849] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:24:25.836 [2024-12-15 05:14:45.905858] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.037 ms 00:24:25.836 [2024-12-15 05:14:45.905871] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:24:25.836 [2024-12-15 05:14:45.906954] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 3687.887 ms, result 0 00:24:25.836 { 00:24:25.836 "name": "ftl0", 00:24:25.836 "uuid": "a3f0cc82-b3b1-4076-8939-aceeb9079b03" 00:24:25.836 } 00:24:25.836 05:14:45 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@64 -- # echo '{"subsystems": [' 00:24:25.836 05:14:45 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@65 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py save_subsystem_config -n bdev 00:24:26.097 05:14:46 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@66 -- # echo ']}' 00:24:26.097 05:14:46 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@70 -- # modprobe nbd 00:24:26.097 05:14:46 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@71 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py nbd_start_disk ftl0 /dev/nbd0 00:24:26.358 /dev/nbd0 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@72 -- # waitfornbd nbd0 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@872 -- # local nbd_name=nbd0 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@873 -- # local i 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- 
common/autotest_common.sh@875 -- # (( i = 1 )) 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@875 -- # (( i <= 20 )) 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@876 -- # grep -q -w nbd0 /proc/partitions 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@877 -- # break 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@888 -- # (( i = 1 )) 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@888 -- # (( i <= 20 )) 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@889 -- # dd if=/dev/nbd0 of=/home/vagrant/spdk_repo/spdk/test/ftl/nbdtest bs=4096 count=1 iflag=direct 00:24:26.358 1+0 records in 00:24:26.358 1+0 records out 00:24:26.358 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000513161 s, 8.0 MB/s 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@890 -- # stat -c %s /home/vagrant/spdk_repo/spdk/test/ftl/nbdtest 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@890 -- # size=4096 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@891 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/nbdtest 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@892 -- # '[' 4096 '!=' 0 ']' 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@893 -- # return 0 00:24:26.358 05:14:46 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@75 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd -m 0x2 --if=/dev/urandom --of=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --bs=4096 --count=262144 00:24:26.358 [2024-12-15 05:14:46.468086] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:24:26.358 [2024-12-15 05:14:46.468235] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid93262 ] 00:24:26.619 [2024-12-15 05:14:46.628558] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:26.619 [2024-12-15 05:14:46.657834] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:24:28.003  [2024-12-15T05:14:49.086Z] Copying: 191/1024 [MB] (191 MBps) [2024-12-15T05:14:50.028Z] Copying: 381/1024 [MB] (190 MBps) [2024-12-15T05:14:50.967Z] Copying: 606/1024 [MB] (225 MBps) [2024-12-15T05:14:51.533Z] Copying: 865/1024 [MB] (258 MBps) [2024-12-15T05:14:51.533Z] Copying: 1024/1024 [MB] (average 221 MBps) 00:24:31.393 00:24:31.393 05:14:51 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@76 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/testfile 00:24:33.922 05:14:53 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@77 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd -m 0x2 --if=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --of=/dev/nbd0 --bs=4096 --count=262144 --oflag=direct 00:24:33.922 [2024-12-15 05:14:53.714281] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
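The spdk_dd invocation whose startup banner appears just above is the write half of the dirty-shutdown setup: the 1 GiB testfile that was filled from /dev/urandom and md5summed a few lines earlier is now copied onto /dev/nbd0, the NBD export of the FTL bdev ftl0, with --oflag=direct so the writes reach the device rather than the page cache. A condensed sketch of this data-path phase, reconstructed from the trace above (paths are shortened and the default RPC socket is assumed; this is not a verbatim replay of the harness):

# export the FTL bdev as a kernel block device
modprobe nbd
rpc.py nbd_start_disk ftl0 /dev/nbd0
# 262144 blocks x 4096 B = 1 GiB of random test data, matching the
# "Copying: 1024/1024 [MB]" progress shown above
spdk_dd -m 0x2 --if=/dev/urandom --of=testfile --bs=4096 --count=262144
md5sum testfile        # reference checksum taken before the shutdown
spdk_dd -m 0x2 --if=testfile --of=/dev/nbd0 --bs=4096 --count=262144 --oflag=direct

Going through NBD exercises the same kernel block-device path a filesystem would use, and the checksum presumably serves as the reference the test verifies against after the shutdown.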
00:24:33.922 [2024-12-15 05:14:53.714572] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid93343 ] 00:24:33.922 [2024-12-15 05:14:53.862365] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:33.922 [2024-12-15 05:14:53.878904] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:24:34.862  [2024-12-15T05:14:55.945Z] Copying: 24/1024 [MB] (24 MBps) [2024-12-15T05:14:57.329Z] Copying: 48/1024 [MB] (23 MBps) [2024-12-15T05:14:58.272Z] Copying: 71/1024 [MB] (23 MBps) [2024-12-15T05:14:59.212Z] Copying: 94/1024 [MB] (22 MBps) [2024-12-15T05:15:00.151Z] Copying: 117/1024 [MB] (23 MBps) [2024-12-15T05:15:01.087Z] Copying: 141/1024 [MB] (23 MBps) [2024-12-15T05:15:02.021Z] Copying: 168/1024 [MB] (27 MBps) [2024-12-15T05:15:03.030Z] Copying: 203/1024 [MB] (35 MBps) [2024-12-15T05:15:03.973Z] Copying: 230/1024 [MB] (26 MBps) [2024-12-15T05:15:05.363Z] Copying: 252/1024 [MB] (22 MBps) [2024-12-15T05:15:05.935Z] Copying: 278/1024 [MB] (25 MBps) [2024-12-15T05:15:07.335Z] Copying: 302/1024 [MB] (23 MBps) [2024-12-15T05:15:08.275Z] Copying: 328/1024 [MB] (26 MBps) [2024-12-15T05:15:09.209Z] Copying: 342/1024 [MB] (14 MBps) [2024-12-15T05:15:10.144Z] Copying: 361/1024 [MB] (18 MBps) [2024-12-15T05:15:11.078Z] Copying: 385/1024 [MB] (24 MBps) [2024-12-15T05:15:12.012Z] Copying: 416/1024 [MB] (31 MBps) [2024-12-15T05:15:12.946Z] Copying: 433/1024 [MB] (16 MBps) [2024-12-15T05:15:14.320Z] Copying: 452/1024 [MB] (18 MBps) [2024-12-15T05:15:15.263Z] Copying: 470/1024 [MB] (17 MBps) [2024-12-15T05:15:16.199Z] Copying: 489/1024 [MB] (19 MBps) [2024-12-15T05:15:17.133Z] Copying: 510/1024 [MB] (20 MBps) [2024-12-15T05:15:18.067Z] Copying: 532/1024 [MB] (22 MBps) [2024-12-15T05:15:19.001Z] Copying: 553/1024 [MB] (20 MBps) [2024-12-15T05:15:19.935Z] Copying: 572/1024 [MB] (19 MBps) [2024-12-15T05:15:21.308Z] Copying: 593/1024 [MB] (20 MBps) [2024-12-15T05:15:22.243Z] Copying: 617/1024 [MB] (24 MBps) [2024-12-15T05:15:23.177Z] Copying: 642/1024 [MB] (25 MBps) [2024-12-15T05:15:24.111Z] Copying: 662/1024 [MB] (20 MBps) [2024-12-15T05:15:25.044Z] Copying: 696/1024 [MB] (34 MBps) [2024-12-15T05:15:26.074Z] Copying: 727/1024 [MB] (30 MBps) [2024-12-15T05:15:27.008Z] Copying: 745/1024 [MB] (18 MBps) [2024-12-15T05:15:27.942Z] Copying: 771/1024 [MB] (25 MBps) [2024-12-15T05:15:29.316Z] Copying: 801/1024 [MB] (29 MBps) [2024-12-15T05:15:30.249Z] Copying: 820/1024 [MB] (19 MBps) [2024-12-15T05:15:31.192Z] Copying: 839/1024 [MB] (18 MBps) [2024-12-15T05:15:32.126Z] Copying: 860/1024 [MB] (21 MBps) [2024-12-15T05:15:33.060Z] Copying: 879/1024 [MB] (19 MBps) [2024-12-15T05:15:33.994Z] Copying: 899/1024 [MB] (20 MBps) [2024-12-15T05:15:34.927Z] Copying: 918/1024 [MB] (19 MBps) [2024-12-15T05:15:36.301Z] Copying: 939/1024 [MB] (20 MBps) [2024-12-15T05:15:37.234Z] Copying: 958/1024 [MB] (19 MBps) [2024-12-15T05:15:38.168Z] Copying: 978/1024 [MB] (19 MBps) [2024-12-15T05:15:39.101Z] Copying: 996/1024 [MB] (18 MBps) [2024-12-15T05:15:39.361Z] Copying: 1016/1024 [MB] (19 MBps) [2024-12-15T05:15:39.361Z] Copying: 1024/1024 [MB] (average 22 MBps) 00:25:19.221 00:25:19.221 05:15:39 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@78 -- # sync /dev/nbd0 00:25:19.221 05:15:39 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@79 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py nbd_stop_disk /dev/nbd0 
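With the copy complete (1024/1024 MB at an average of 22 MBps), the harness flushes and tears the stack down in reverse order; the FTL shutdown trace that follows is emitted by the bdev_ftl_unload call. A minimal sketch of the teardown, under the same shortened-path and default-socket assumptions as the sketch above:

sync /dev/nbd0                    # flush outstanding writes to the NBD device
rpc.py nbd_stop_disk /dev/nbd0    # detach the NBD export of ftl0
rpc.py bdev_ftl_unload -b ftl0    # persist L2P and metadata, then shut the instance down

Despite the test's name, this particular unload is orderly: the trace below persists the L2P, NV cache, band, and trim metadata, records "Set FTL clean state", and finishes with result 0.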
00:25:19.480 05:15:39 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@80 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_unload -b ftl0 00:25:19.742 [2024-12-15 05:15:39.743067] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.742 [2024-12-15 05:15:39.743107] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:25:19.742 [2024-12-15 05:15:39.743121] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.002 ms 00:25:19.742 [2024-12-15 05:15:39.743127] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.742 [2024-12-15 05:15:39.743148] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:25:19.742 [2024-12-15 05:15:39.743576] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.742 [2024-12-15 05:15:39.743600] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:25:19.742 [2024-12-15 05:15:39.743607] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.417 ms 00:25:19.742 [2024-12-15 05:15:39.743615] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.742 [2024-12-15 05:15:39.746088] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.742 [2024-12-15 05:15:39.746117] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:25:19.742 [2024-12-15 05:15:39.746125] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.456 ms 00:25:19.742 [2024-12-15 05:15:39.746133] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.742 [2024-12-15 05:15:39.761834] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.742 [2024-12-15 05:15:39.761865] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:25:19.742 [2024-12-15 05:15:39.761876] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 15.687 ms 00:25:19.742 [2024-12-15 05:15:39.761883] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.742 [2024-12-15 05:15:39.766655] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.742 [2024-12-15 05:15:39.766681] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:25:19.742 [2024-12-15 05:15:39.766695] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.746 ms 00:25:19.742 [2024-12-15 05:15:39.766702] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.742 [2024-12-15 05:15:39.767930] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.742 [2024-12-15 05:15:39.767970] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:25:19.742 [2024-12-15 05:15:39.767978] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.180 ms 00:25:19.742 [2024-12-15 05:15:39.767985] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.742 [2024-12-15 05:15:39.772016] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.742 [2024-12-15 05:15:39.772053] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:25:19.742 [2024-12-15 05:15:39.772061] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.006 ms 00:25:19.742 [2024-12-15 05:15:39.772069] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.742 [2024-12-15 05:15:39.772172] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
00:25:19.743 [2024-12-15 05:15:39.772185] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:25:19.743 [2024-12-15 05:15:39.772192] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.076 ms 00:25:19.743 [2024-12-15 05:15:39.772203] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.743 [2024-12-15 05:15:39.773912] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.743 [2024-12-15 05:15:39.773943] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:25:19.743 [2024-12-15 05:15:39.773949] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.697 ms 00:25:19.743 [2024-12-15 05:15:39.773956] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.743 [2024-12-15 05:15:39.775460] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.743 [2024-12-15 05:15:39.775492] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:25:19.743 [2024-12-15 05:15:39.775499] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.479 ms 00:25:19.743 [2024-12-15 05:15:39.775506] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.743 [2024-12-15 05:15:39.776708] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.743 [2024-12-15 05:15:39.776739] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:25:19.743 [2024-12-15 05:15:39.776746] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.176 ms 00:25:19.743 [2024-12-15 05:15:39.776752] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.743 [2024-12-15 05:15:39.777756] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.743 [2024-12-15 05:15:39.777785] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:25:19.743 [2024-12-15 05:15:39.777792] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.959 ms 00:25:19.743 [2024-12-15 05:15:39.777799] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.743 [2024-12-15 05:15:39.777823] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:25:19.743 [2024-12-15 05:15:39.777838] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777845] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777852] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777857] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777866] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777872] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777879] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777884] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777892] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 
00:25:19.743 [2024-12-15 05:15:39.777897] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777906] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777912] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777918] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777924] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777931] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777936] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777943] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777948] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777956] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777962] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777970] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777976] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777982] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777988] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.777994] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778000] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778006] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778012] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778018] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778024] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778032] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778038] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778045] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778051] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 
wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778057] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778063] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778072] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778078] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778085] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778090] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778097] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778103] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778109] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778115] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778121] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778126] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778133] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778139] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778146] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778156] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778163] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778168] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778177] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778182] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778189] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778195] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778201] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778207] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778213] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] 
Band 59: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778219] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778226] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778231] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778245] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778250] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778258] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778263] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778270] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778275] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778284] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778290] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778297] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778302] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778309] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:25:19.743 [2024-12-15 05:15:39.778314] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778321] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778327] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778333] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778339] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778346] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778352] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778358] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778364] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778371] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778377] ftl_debug.c: 167:ftl_dev_dump_bands: 
*NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778385] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778391] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778398] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778403] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778411] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778421] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778428] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778442] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778450] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778455] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778463] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778468] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778475] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778481] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778488] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778494] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:25:19.744 [2024-12-15 05:15:39.778508] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:25:19.744 [2024-12-15 05:15:39.778515] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: a3f0cc82-b3b1-4076-8939-aceeb9079b03 00:25:19.744 [2024-12-15 05:15:39.778522] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:25:19.744 [2024-12-15 05:15:39.778528] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:25:19.744 [2024-12-15 05:15:39.778534] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:25:19.744 [2024-12-15 05:15:39.778540] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:25:19.744 [2024-12-15 05:15:39.778547] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:25:19.744 [2024-12-15 05:15:39.778552] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:25:19.744 [2024-12-15 05:15:39.778559] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:25:19.744 [2024-12-15 05:15:39.778564] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:25:19.744 [2024-12-15 05:15:39.778570] 
ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:25:19.744 [2024-12-15 05:15:39.778575] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.744 [2024-12-15 05:15:39.778582] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:25:19.744 [2024-12-15 05:15:39.778589] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.753 ms 00:25:19.744 [2024-12-15 05:15:39.778597] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.779850] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.744 [2024-12-15 05:15:39.779875] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:25:19.744 [2024-12-15 05:15:39.779883] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.240 ms 00:25:19.744 [2024-12-15 05:15:39.779891] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.779959] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:19.744 [2024-12-15 05:15:39.779969] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:25:19.744 [2024-12-15 05:15:39.779975] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.054 ms 00:25:19.744 [2024-12-15 05:15:39.779981] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.784419] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.784453] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:25:19.744 [2024-12-15 05:15:39.784460] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.784468] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.784509] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.784519] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:25:19.744 [2024-12-15 05:15:39.784527] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.784535] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.784587] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.784599] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:25:19.744 [2024-12-15 05:15:39.784605] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.784612] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.784624] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.784632] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:25:19.744 [2024-12-15 05:15:39.784639] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.784646] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.792591] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.792631] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:25:19.744 [2024-12-15 05:15:39.792639] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: 
[FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.792647] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.799140] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.799175] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:25:19.744 [2024-12-15 05:15:39.799185] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.799196] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.799233] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.799244] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:25:19.744 [2024-12-15 05:15:39.799250] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.799257] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.799296] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.799305] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:25:19.744 [2024-12-15 05:15:39.799311] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.799320] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.799372] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.799381] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:25:19.744 [2024-12-15 05:15:39.799387] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.799394] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.799417] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.799425] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:25:19.744 [2024-12-15 05:15:39.799442] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.799450] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.799480] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.799490] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:25:19.744 [2024-12-15 05:15:39.799496] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.799503] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.799535] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:25:19.744 [2024-12-15 05:15:39.799544] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:25:19.744 [2024-12-15 05:15:39.799550] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:25:19.744 [2024-12-15 05:15:39.799559] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:19.744 [2024-12-15 05:15:39.799660] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 56.565 ms, result 0 00:25:19.744 true 00:25:19.744 05:15:39 ftl.ftl_dirty_shutdown -- 
ftl/dirty_shutdown.sh@83 -- # kill -9 93129 00:25:19.744 05:15:39 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@84 -- # rm -f /dev/shm/spdk_tgt_trace.pid93129 00:25:19.744 05:15:39 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@87 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/dev/urandom --of=/home/vagrant/spdk_repo/spdk/test/ftl/testfile2 --bs=4096 --count=262144 00:25:19.744 [2024-12-15 05:15:39.870191] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:25:19.744 [2024-12-15 05:15:39.870320] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid93823 ] 00:25:20.006 [2024-12-15 05:15:40.024870] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:20.006 [2024-12-15 05:15:40.052578] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:25:21.393  [2024-12-15T05:15:42.478Z] Copying: 199/1024 [MB] (199 MBps) [2024-12-15T05:15:43.420Z] Copying: 457/1024 [MB] (257 MBps) [2024-12-15T05:15:44.359Z] Copying: 718/1024 [MB] (260 MBps) [2024-12-15T05:15:44.359Z] Copying: 975/1024 [MB] (257 MBps) [2024-12-15T05:15:44.620Z] Copying: 1024/1024 [MB] (average 244 MBps) 00:25:24.480 00:25:24.480 /home/vagrant/spdk_repo/spdk/test/ftl/dirty_shutdown.sh: line 87: 93129 Killed "$SPDK_BIN_DIR/spdk_tgt" -m 0x1 00:25:24.480 05:15:44 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@88 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/ftl/testfile2 --ob=ftl0 --count=262144 --seek=262144 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:25:24.480 [2024-12-15 05:15:44.516593] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:25:24.480 [2024-12-15 05:15:44.516717] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid93876 ] 00:25:24.741 [2024-12-15 05:15:44.669966] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:24.741 [2024-12-15 05:15:44.689839] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:25:24.741 [2024-12-15 05:15:44.773499] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:25:24.741 [2024-12-15 05:15:44.773548] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:25:24.741 [2024-12-15 05:15:44.835198] blobstore.c:4899:bs_recover: *NOTICE*: Performing recovery on blobstore 00:25:24.741 [2024-12-15 05:15:44.835486] blobstore.c:4846:bs_load_replay_md_cpl: *NOTICE*: Recover: blob 0x0 00:25:24.741 [2024-12-15 05:15:44.835680] blobstore.c:4846:bs_load_replay_md_cpl: *NOTICE*: Recover: blob 0x1 00:25:25.004 [2024-12-15 05:15:45.051394] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.051425] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:25:25.004 [2024-12-15 05:15:45.051443] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:25:25.004 [2024-12-15 05:15:45.051449] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.051484] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.051491] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:25:25.004 [2024-12-15 05:15:45.051498] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.022 ms 00:25:25.004 [2024-12-15 05:15:45.051503] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.051521] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:25:25.004 [2024-12-15 05:15:45.051703] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:25:25.004 [2024-12-15 05:15:45.051715] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.051721] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:25:25.004 [2024-12-15 05:15:45.051727] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.198 ms 00:25:25.004 [2024-12-15 05:15:45.051732] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.052631] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:25:25.004 [2024-12-15 05:15:45.054628] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.054653] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:25:25.004 [2024-12-15 05:15:45.054660] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.997 ms 00:25:25.004 [2024-12-15 05:15:45.054671] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.054711] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.054718] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super 
block 00:25:25.004 [2024-12-15 05:15:45.054726] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.015 ms 00:25:25.004 [2024-12-15 05:15:45.054731] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.059043] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.059065] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:25:25.004 [2024-12-15 05:15:45.059072] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.274 ms 00:25:25.004 [2024-12-15 05:15:45.059078] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.059143] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.059150] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:25:25.004 [2024-12-15 05:15:45.059157] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.052 ms 00:25:25.004 [2024-12-15 05:15:45.059164] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.059198] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.059205] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:25:25.004 [2024-12-15 05:15:45.059215] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:25:25.004 [2024-12-15 05:15:45.059221] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.059236] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:25:25.004 [2024-12-15 05:15:45.060382] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.060406] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:25:25.004 [2024-12-15 05:15:45.060412] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.149 ms 00:25:25.004 [2024-12-15 05:15:45.060419] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.060459] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.060467] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:25:25.004 [2024-12-15 05:15:45.060473] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.009 ms 00:25:25.004 [2024-12-15 05:15:45.060478] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.060496] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:25:25.004 [2024-12-15 05:15:45.060512] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:25:25.004 [2024-12-15 05:15:45.060538] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:25:25.004 [2024-12-15 05:15:45.060551] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:25:25.004 [2024-12-15 05:15:45.060628] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:25:25.004 [2024-12-15 05:15:45.060636] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:25:25.004 
[2024-12-15 05:15:45.060643] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:25:25.004 [2024-12-15 05:15:45.060653] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:25:25.004 [2024-12-15 05:15:45.060663] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:25:25.004 [2024-12-15 05:15:45.060669] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:25:25.004 [2024-12-15 05:15:45.060674] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:25:25.004 [2024-12-15 05:15:45.060682] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:25:25.004 [2024-12-15 05:15:45.060689] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:25:25.004 [2024-12-15 05:15:45.060697] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.060702] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:25:25.004 [2024-12-15 05:15:45.060712] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.203 ms 00:25:25.004 [2024-12-15 05:15:45.060717] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.060781] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.004 [2024-12-15 05:15:45.060787] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:25:25.004 [2024-12-15 05:15:45.060792] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.054 ms 00:25:25.004 [2024-12-15 05:15:45.060801] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.004 [2024-12-15 05:15:45.060873] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:25:25.004 [2024-12-15 05:15:45.060885] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:25:25.004 [2024-12-15 05:15:45.060891] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:25:25.004 [2024-12-15 05:15:45.060897] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:25:25.004 [2024-12-15 05:15:45.060903] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:25:25.004 [2024-12-15 05:15:45.060909] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:25:25.004 [2024-12-15 05:15:45.060915] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:25:25.004 [2024-12-15 05:15:45.060920] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:25:25.004 [2024-12-15 05:15:45.060925] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:25:25.004 [2024-12-15 05:15:45.060930] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:25:25.004 [2024-12-15 05:15:45.060935] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:25:25.004 [2024-12-15 05:15:45.060943] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:25:25.004 [2024-12-15 05:15:45.060948] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:25:25.004 [2024-12-15 05:15:45.060953] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:25:25.004 [2024-12-15 05:15:45.060958] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:25:25.004 [2024-12-15 05:15:45.060964] ftl_layout.c: 133:dump_region: 
*NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:25:25.004 [2024-12-15 05:15:45.060969] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:25:25.004 [2024-12-15 05:15:45.060974] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:25:25.004 [2024-12-15 05:15:45.060979] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:25:25.004 [2024-12-15 05:15:45.060984] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:25:25.004 [2024-12-15 05:15:45.060988] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:25:25.004 [2024-12-15 05:15:45.060993] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:25:25.004 [2024-12-15 05:15:45.060998] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:25:25.004 [2024-12-15 05:15:45.061003] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:25:25.005 [2024-12-15 05:15:45.061008] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:25:25.005 [2024-12-15 05:15:45.061012] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:25:25.005 [2024-12-15 05:15:45.061017] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:25:25.005 [2024-12-15 05:15:45.061027] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:25:25.005 [2024-12-15 05:15:45.061032] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:25:25.005 [2024-12-15 05:15:45.061038] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:25:25.005 [2024-12-15 05:15:45.061043] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:25:25.005 [2024-12-15 05:15:45.061049] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:25:25.005 [2024-12-15 05:15:45.061055] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:25:25.005 [2024-12-15 05:15:45.061061] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:25:25.005 [2024-12-15 05:15:45.061066] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:25:25.005 [2024-12-15 05:15:45.061072] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:25:25.005 [2024-12-15 05:15:45.061078] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:25:25.005 [2024-12-15 05:15:45.061085] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:25:25.005 [2024-12-15 05:15:45.061090] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:25:25.005 [2024-12-15 05:15:45.061096] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:25:25.005 [2024-12-15 05:15:45.061102] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:25:25.005 [2024-12-15 05:15:45.061107] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:25:25.005 [2024-12-15 05:15:45.061113] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:25:25.005 [2024-12-15 05:15:45.061120] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:25:25.005 [2024-12-15 05:15:45.061127] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:25:25.005 [2024-12-15 05:15:45.061133] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:25:25.005 [2024-12-15 05:15:45.061139] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:25:25.005 [2024-12-15 
05:15:45.061148] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:25:25.005 [2024-12-15 05:15:45.061154] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:25:25.005 [2024-12-15 05:15:45.061159] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:25:25.005 [2024-12-15 05:15:45.061165] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:25:25.005 [2024-12-15 05:15:45.061171] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:25:25.005 [2024-12-15 05:15:45.061176] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:25:25.005 [2024-12-15 05:15:45.061183] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:25:25.005 [2024-12-15 05:15:45.061191] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:25:25.005 [2024-12-15 05:15:45.061198] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:25:25.005 [2024-12-15 05:15:45.061204] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:25:25.005 [2024-12-15 05:15:45.061210] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:25:25.005 [2024-12-15 05:15:45.061216] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:25:25.005 [2024-12-15 05:15:45.061224] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:25:25.005 [2024-12-15 05:15:45.061230] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:25:25.005 [2024-12-15 05:15:45.061236] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:25:25.005 [2024-12-15 05:15:45.061242] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:25:25.005 [2024-12-15 05:15:45.061248] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:25:25.005 [2024-12-15 05:15:45.061254] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:25:25.005 [2024-12-15 05:15:45.061260] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:25:25.005 [2024-12-15 05:15:45.061266] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:25:25.005 [2024-12-15 05:15:45.061272] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:25:25.005 [2024-12-15 05:15:45.061278] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:25:25.005 [2024-12-15 05:15:45.061285] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - 
base dev: 00:25:25.005 [2024-12-15 05:15:45.061294] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:25:25.005 [2024-12-15 05:15:45.061300] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:25:25.005 [2024-12-15 05:15:45.061307] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:25:25.005 [2024-12-15 05:15:45.061313] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:25:25.005 [2024-12-15 05:15:45.061319] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:25:25.005 [2024-12-15 05:15:45.061327] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.061334] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:25:25.005 [2024-12-15 05:15:45.061340] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.505 ms 00:25:25.005 [2024-12-15 05:15:45.061347] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.069090] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.069119] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:25:25.005 [2024-12-15 05:15:45.069127] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.713 ms 00:25:25.005 [2024-12-15 05:15:45.069133] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.069196] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.069202] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:25:25.005 [2024-12-15 05:15:45.069208] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.046 ms 00:25:25.005 [2024-12-15 05:15:45.069217] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.086728] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.086769] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:25:25.005 [2024-12-15 05:15:45.086783] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 17.473 ms 00:25:25.005 [2024-12-15 05:15:45.086798] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.086847] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.086859] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:25:25.005 [2024-12-15 05:15:45.086870] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:25:25.005 [2024-12-15 05:15:45.086878] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.087255] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.087287] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:25:25.005 [2024-12-15 05:15:45.087298] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.311 ms 00:25:25.005 [2024-12-15 05:15:45.087308] mngt/ftl_mngt.c: 
431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.087489] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.087515] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:25:25.005 [2024-12-15 05:15:45.087527] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.149 ms 00:25:25.005 [2024-12-15 05:15:45.087538] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.092921] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.092949] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:25:25.005 [2024-12-15 05:15:45.092958] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.361 ms 00:25:25.005 [2024-12-15 05:15:45.092965] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.095212] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 2, empty chunks = 2 00:25:25.005 [2024-12-15 05:15:45.095246] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:25:25.005 [2024-12-15 05:15:45.095257] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.095264] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:25:25.005 [2024-12-15 05:15:45.095272] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.217 ms 00:25:25.005 [2024-12-15 05:15:45.095279] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.108412] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.108444] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:25:25.005 [2024-12-15 05:15:45.108458] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 13.099 ms 00:25:25.005 [2024-12-15 05:15:45.108465] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.109951] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.109974] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:25:25.005 [2024-12-15 05:15:45.109981] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.457 ms 00:25:25.005 [2024-12-15 05:15:45.109987] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.111181] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.005 [2024-12-15 05:15:45.111205] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:25:25.005 [2024-12-15 05:15:45.111213] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.169 ms 00:25:25.005 [2024-12-15 05:15:45.111219] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.005 [2024-12-15 05:15:45.111474] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.006 [2024-12-15 05:15:45.111486] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:25:25.006 [2024-12-15 05:15:45.111493] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.206 ms 00:25:25.006 [2024-12-15 05:15:45.111499] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.006 
[2024-12-15 05:15:45.125540] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.006 [2024-12-15 05:15:45.125570] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:25:25.006 [2024-12-15 05:15:45.125578] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 14.030 ms 00:25:25.006 [2024-12-15 05:15:45.125585] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.006 [2024-12-15 05:15:45.131295] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:25:25.006 [2024-12-15 05:15:45.133098] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.006 [2024-12-15 05:15:45.133119] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:25:25.006 [2024-12-15 05:15:45.133126] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.483 ms 00:25:25.006 [2024-12-15 05:15:45.133131] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.006 [2024-12-15 05:15:45.133169] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.006 [2024-12-15 05:15:45.133179] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:25:25.006 [2024-12-15 05:15:45.133187] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.008 ms 00:25:25.006 [2024-12-15 05:15:45.133193] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.006 [2024-12-15 05:15:45.133243] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.006 [2024-12-15 05:15:45.133251] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:25:25.006 [2024-12-15 05:15:45.133257] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.020 ms 00:25:25.006 [2024-12-15 05:15:45.133263] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.006 [2024-12-15 05:15:45.133277] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.006 [2024-12-15 05:15:45.133283] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:25:25.006 [2024-12-15 05:15:45.133288] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:25:25.006 [2024-12-15 05:15:45.133296] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.006 [2024-12-15 05:15:45.133319] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:25:25.006 [2024-12-15 05:15:45.133327] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.006 [2024-12-15 05:15:45.133336] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:25:25.006 [2024-12-15 05:15:45.133343] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.008 ms 00:25:25.006 [2024-12-15 05:15:45.133349] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.006 [2024-12-15 05:15:45.136024] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.006 [2024-12-15 05:15:45.136049] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:25:25.006 [2024-12-15 05:15:45.136056] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.663 ms 00:25:25.006 [2024-12-15 05:15:45.136062] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.006 [2024-12-15 05:15:45.136116] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:25:25.006 [2024-12-15 05:15:45.136123] 
mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:25:25.006 [2024-12-15 05:15:45.136130] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.026 ms 00:25:25.006 [2024-12-15 05:15:45.136135] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:25:25.006 [2024-12-15 05:15:45.137007] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 85.309 ms, result 0 00:25:26.391  [2024-12-15T05:15:47.474Z] Copying: 20/1024 [MB] (20 MBps) [2024-12-15T05:15:48.476Z] Copying: 36/1024 [MB] (15 MBps) [2024-12-15T05:15:49.420Z] Copying: 57/1024 [MB] (21 MBps) [2024-12-15T05:15:50.365Z] Copying: 73/1024 [MB] (15 MBps) [2024-12-15T05:15:51.306Z] Copying: 91/1024 [MB] (17 MBps) [2024-12-15T05:15:52.246Z] Copying: 114/1024 [MB] (22 MBps) [2024-12-15T05:15:53.190Z] Copying: 152/1024 [MB] (37 MBps) [2024-12-15T05:15:54.577Z] Copying: 171/1024 [MB] (19 MBps) [2024-12-15T05:15:55.149Z] Copying: 206/1024 [MB] (35 MBps) [2024-12-15T05:15:56.531Z] Copying: 222/1024 [MB] (15 MBps) [2024-12-15T05:15:57.474Z] Copying: 251/1024 [MB] (28 MBps) [2024-12-15T05:15:58.417Z] Copying: 262/1024 [MB] (10 MBps) [2024-12-15T05:15:59.362Z] Copying: 281/1024 [MB] (19 MBps) [2024-12-15T05:16:00.303Z] Copying: 298/1024 [MB] (17 MBps) [2024-12-15T05:16:01.246Z] Copying: 317/1024 [MB] (18 MBps) [2024-12-15T05:16:02.189Z] Copying: 356/1024 [MB] (38 MBps) [2024-12-15T05:16:03.576Z] Copying: 394/1024 [MB] (38 MBps) [2024-12-15T05:16:04.149Z] Copying: 416/1024 [MB] (21 MBps) [2024-12-15T05:16:05.535Z] Copying: 436/1024 [MB] (20 MBps) [2024-12-15T05:16:06.477Z] Copying: 459/1024 [MB] (23 MBps) [2024-12-15T05:16:07.420Z] Copying: 475/1024 [MB] (16 MBps) [2024-12-15T05:16:08.362Z] Copying: 489/1024 [MB] (13 MBps) [2024-12-15T05:16:09.304Z] Copying: 501/1024 [MB] (11 MBps) [2024-12-15T05:16:10.249Z] Copying: 518/1024 [MB] (16 MBps) [2024-12-15T05:16:11.198Z] Copying: 530/1024 [MB] (12 MBps) [2024-12-15T05:16:12.196Z] Copying: 545/1024 [MB] (15 MBps) [2024-12-15T05:16:13.586Z] Copying: 563/1024 [MB] (17 MBps) [2024-12-15T05:16:14.158Z] Copying: 575/1024 [MB] (12 MBps) [2024-12-15T05:16:15.546Z] Copying: 588/1024 [MB] (13 MBps) [2024-12-15T05:16:16.491Z] Copying: 601/1024 [MB] (13 MBps) [2024-12-15T05:16:17.435Z] Copying: 622/1024 [MB] (20 MBps) [2024-12-15T05:16:18.379Z] Copying: 650/1024 [MB] (27 MBps) [2024-12-15T05:16:19.324Z] Copying: 661/1024 [MB] (11 MBps) [2024-12-15T05:16:20.269Z] Copying: 673/1024 [MB] (12 MBps) [2024-12-15T05:16:21.212Z] Copying: 687/1024 [MB] (13 MBps) [2024-12-15T05:16:22.157Z] Copying: 716/1024 [MB] (28 MBps) [2024-12-15T05:16:23.544Z] Copying: 733/1024 [MB] (16 MBps) [2024-12-15T05:16:24.486Z] Copying: 751/1024 [MB] (18 MBps) [2024-12-15T05:16:25.429Z] Copying: 764/1024 [MB] (13 MBps) [2024-12-15T05:16:26.373Z] Copying: 781/1024 [MB] (16 MBps) [2024-12-15T05:16:27.317Z] Copying: 799/1024 [MB] (18 MBps) [2024-12-15T05:16:28.262Z] Copying: 813/1024 [MB] (13 MBps) [2024-12-15T05:16:29.207Z] Copying: 834/1024 [MB] (20 MBps) [2024-12-15T05:16:30.150Z] Copying: 855/1024 [MB] (20 MBps) [2024-12-15T05:16:31.554Z] Copying: 876/1024 [MB] (20 MBps) [2024-12-15T05:16:32.498Z] Copying: 894/1024 [MB] (18 MBps) [2024-12-15T05:16:33.458Z] Copying: 916/1024 [MB] (22 MBps) [2024-12-15T05:16:34.403Z] Copying: 929/1024 [MB] (13 MBps) [2024-12-15T05:16:35.412Z] Copying: 945/1024 [MB] (15 MBps) [2024-12-15T05:16:36.355Z] Copying: 961/1024 [MB] (16 MBps) [2024-12-15T05:16:37.299Z] Copying: 981/1024 [MB] (20 MBps) 
[2024-12-15T05:16:38.244Z] Copying: 1010/1024 [MB] (28 MBps) [2024-12-15T05:16:38.815Z] Copying: 1023/1024 [MB] (13 MBps) [2024-12-15T05:16:38.815Z] Copying: 1024/1024 [MB] (average 19 MBps)[2024-12-15 05:16:38.687001] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.675 [2024-12-15 05:16:38.687078] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:26:18.675 [2024-12-15 05:16:38.687098] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:26:18.675 [2024-12-15 05:16:38.687107] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.675 [2024-12-15 05:16:38.689687] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:26:18.675 [2024-12-15 05:16:38.693839] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.675 [2024-12-15 05:16:38.693904] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:26:18.675 [2024-12-15 05:16:38.693916] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.075 ms 00:26:18.675 [2024-12-15 05:16:38.693925] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.676 [2024-12-15 05:16:38.704488] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.676 [2024-12-15 05:16:38.704540] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:26:18.676 [2024-12-15 05:16:38.704552] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 9.822 ms 00:26:18.676 [2024-12-15 05:16:38.704560] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.676 [2024-12-15 05:16:38.728559] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.676 [2024-12-15 05:16:38.728624] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:26:18.676 [2024-12-15 05:16:38.728637] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 23.972 ms 00:26:18.676 [2024-12-15 05:16:38.728645] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.676 [2024-12-15 05:16:38.734733] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.676 [2024-12-15 05:16:38.734782] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:26:18.676 [2024-12-15 05:16:38.734794] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.059 ms 00:26:18.676 [2024-12-15 05:16:38.734811] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.676 [2024-12-15 05:16:38.737650] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.676 [2024-12-15 05:16:38.737719] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:26:18.676 [2024-12-15 05:16:38.737730] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.781 ms 00:26:18.676 [2024-12-15 05:16:38.737737] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.676 [2024-12-15 05:16:38.743024] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.676 [2024-12-15 05:16:38.743081] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:26:18.676 [2024-12-15 05:16:38.743105] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.242 ms 00:26:18.676 [2024-12-15 05:16:38.743113] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.937 [2024-12-15 05:16:38.924168] mngt/ftl_mngt.c: 
427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.937 [2024-12-15 05:16:38.924249] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:26:18.937 [2024-12-15 05:16:38.924264] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 181.006 ms 00:26:18.937 [2024-12-15 05:16:38.924284] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.937 [2024-12-15 05:16:38.927891] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.937 [2024-12-15 05:16:38.927942] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:26:18.937 [2024-12-15 05:16:38.927953] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.589 ms 00:26:18.937 [2024-12-15 05:16:38.927960] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.937 [2024-12-15 05:16:38.930915] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.937 [2024-12-15 05:16:38.930966] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:26:18.937 [2024-12-15 05:16:38.930976] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.913 ms 00:26:18.937 [2024-12-15 05:16:38.930984] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.937 [2024-12-15 05:16:38.933557] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.937 [2024-12-15 05:16:38.933611] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:26:18.937 [2024-12-15 05:16:38.933620] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.531 ms 00:26:18.937 [2024-12-15 05:16:38.933627] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.937 [2024-12-15 05:16:38.936049] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.937 [2024-12-15 05:16:38.936099] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:26:18.937 [2024-12-15 05:16:38.936110] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.351 ms 00:26:18.937 [2024-12-15 05:16:38.936118] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.937 [2024-12-15 05:16:38.936159] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:26:18.937 [2024-12-15 05:16:38.936188] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 98560 / 261120 wr_cnt: 1 state: open 00:26:18.937 [2024-12-15 05:16:38.936206] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936215] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936238] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936248] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936257] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936266] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936275] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936284] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: 
[FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936293] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936301] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936310] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936320] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936329] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936339] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936348] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936357] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936365] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936375] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:26:18.937 [2024-12-15 05:16:38.936383] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936392] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936401] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936409] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936418] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936426] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936451] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936459] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936467] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936475] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936485] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936493] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936501] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936509] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936517] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936524] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936532] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936540] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936548] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936556] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936563] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936572] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936579] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936587] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936594] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936602] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936610] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936617] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936625] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936632] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936640] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936650] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936658] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936666] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936674] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936682] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936690] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936697] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936704] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 
05:16:38.936712] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936719] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936727] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936734] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936741] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936749] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936755] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936763] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936771] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936777] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936786] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936794] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936801] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936809] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936816] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936824] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936832] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936839] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936848] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936855] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936863] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936870] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936877] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936885] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936893] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 
00:26:18.938 [2024-12-15 05:16:38.936901] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936910] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936918] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936926] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936934] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936941] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936949] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936957] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936965] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936972] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936979] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936987] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.936994] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.937001] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.937009] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.937016] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.937023] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:26:18.938 [2024-12-15 05:16:38.937040] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:26:18.938 [2024-12-15 05:16:38.937048] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: a3f0cc82-b3b1-4076-8939-aceeb9079b03 00:26:18.938 [2024-12-15 05:16:38.937057] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 98560 00:26:18.938 [2024-12-15 05:16:38.937065] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 99520 00:26:18.938 [2024-12-15 05:16:38.937072] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 98560 00:26:18.938 [2024-12-15 05:16:38.937088] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: 1.0097 00:26:18.938 [2024-12-15 05:16:38.937097] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:26:18.938 [2024-12-15 05:16:38.937105] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:26:18.938 [2024-12-15 05:16:38.937132] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:26:18.938 [2024-12-15 05:16:38.937139] ftl_debug.c: 
220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:26:18.938 [2024-12-15 05:16:38.937146] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:26:18.938 [2024-12-15 05:16:38.937154] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.938 [2024-12-15 05:16:38.937163] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:26:18.938 [2024-12-15 05:16:38.937174] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.996 ms 00:26:18.939 [2024-12-15 05:16:38.937183] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.939560] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.939 [2024-12-15 05:16:38.939596] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:26:18.939 [2024-12-15 05:16:38.939606] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.357 ms 00:26:18.939 [2024-12-15 05:16:38.939614] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.939735] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:18.939 [2024-12-15 05:16:38.939744] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:26:18.939 [2024-12-15 05:16:38.939758] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.099 ms 00:26:18.939 [2024-12-15 05:16:38.939765] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.947309] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.947367] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:26:18.939 [2024-12-15 05:16:38.947379] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.947387] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.947464] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.947473] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:26:18.939 [2024-12-15 05:16:38.947481] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.947490] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.947557] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.947570] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:26:18.939 [2024-12-15 05:16:38.947578] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.947586] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.947600] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.947616] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:26:18.939 [2024-12-15 05:16:38.947624] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.947631] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.960972] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.961030] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV 
cache 00:26:18.939 [2024-12-15 05:16:38.961041] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.961054] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.971115] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.971165] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:26:18.939 [2024-12-15 05:16:38.971176] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.971184] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.971253] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.971262] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:26:18.939 [2024-12-15 05:16:38.971271] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.971280] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.971325] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.971337] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:26:18.939 [2024-12-15 05:16:38.971349] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.971361] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.971449] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.971460] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:26:18.939 [2024-12-15 05:16:38.971469] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.971477] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.971507] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.971517] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:26:18.939 [2024-12-15 05:16:38.971525] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.971537] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.971576] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.971605] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:26:18.939 [2024-12-15 05:16:38.971614] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.971622] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.971671] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:26:18.939 [2024-12-15 05:16:38.971689] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:26:18.939 [2024-12-15 05:16:38.971700] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:26:18.939 [2024-12-15 05:16:38.971708] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:18.939 [2024-12-15 05:16:38.971832] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 
286.646 ms, result 0 00:26:19.511 00:26:19.511 00:26:19.511 05:16:39 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@90 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/testfile2 00:26:22.056 05:16:41 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@93 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=ftl0 --of=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --count=262144 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:26:22.056 [2024-12-15 05:16:41.872537] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:26:22.056 [2024-12-15 05:16:41.872701] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid94463 ] 00:26:22.056 [2024-12-15 05:16:42.039324] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:22.056 [2024-12-15 05:16:42.068184] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:26:22.056 [2024-12-15 05:16:42.179030] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:26:22.056 [2024-12-15 05:16:42.179109] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:26:22.318 [2024-12-15 05:16:42.341320] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.341379] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:26:22.318 [2024-12-15 05:16:42.341393] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms 00:26:22.318 [2024-12-15 05:16:42.341402] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.341479] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.341491] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:26:22.318 [2024-12-15 05:16:42.341501] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.053 ms 00:26:22.318 [2024-12-15 05:16:42.341518] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.341543] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:26:22.318 [2024-12-15 05:16:42.342348] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:26:22.318 [2024-12-15 05:16:42.342406] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.342421] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:26:22.318 [2024-12-15 05:16:42.342454] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.868 ms 00:26:22.318 [2024-12-15 05:16:42.342464] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.344147] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:26:22.318 [2024-12-15 05:16:42.348111] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.348164] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:26:22.318 [2024-12-15 05:16:42.348183] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.967 ms 00:26:22.318 [2024-12-15 05:16:42.348194] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 
0 00:26:22.318 [2024-12-15 05:16:42.348280] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.348299] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:26:22.318 [2024-12-15 05:16:42.348309] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.030 ms 00:26:22.318 [2024-12-15 05:16:42.348316] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.356521] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.356566] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:26:22.318 [2024-12-15 05:16:42.356581] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.161 ms 00:26:22.318 [2024-12-15 05:16:42.356589] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.356695] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.356705] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:26:22.318 [2024-12-15 05:16:42.356715] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.078 ms 00:26:22.318 [2024-12-15 05:16:42.356723] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.356784] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.356795] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:26:22.318 [2024-12-15 05:16:42.356804] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.008 ms 00:26:22.318 [2024-12-15 05:16:42.356815] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.356846] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:26:22.318 [2024-12-15 05:16:42.358884] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.358928] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:26:22.318 [2024-12-15 05:16:42.358939] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.046 ms 00:26:22.318 [2024-12-15 05:16:42.358947] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.358992] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.359004] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:26:22.318 [2024-12-15 05:16:42.359012] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.016 ms 00:26:22.318 [2024-12-15 05:16:42.359027] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.359053] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:26:22.318 [2024-12-15 05:16:42.359076] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:26:22.318 [2024-12-15 05:16:42.359113] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:26:22.318 [2024-12-15 05:16:42.359129] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:26:22.318 [2024-12-15 05:16:42.359240] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: 
[FTL][ftl0] nvc layout blob store 0x150 bytes 00:26:22.318 [2024-12-15 05:16:42.359250] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:26:22.318 [2024-12-15 05:16:42.359264] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:26:22.318 [2024-12-15 05:16:42.359275] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:26:22.318 [2024-12-15 05:16:42.359284] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:26:22.318 [2024-12-15 05:16:42.359294] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:26:22.318 [2024-12-15 05:16:42.359302] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:26:22.318 [2024-12-15 05:16:42.359313] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:26:22.318 [2024-12-15 05:16:42.359321] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:26:22.318 [2024-12-15 05:16:42.359329] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.359340] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:26:22.318 [2024-12-15 05:16:42.359348] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.282 ms 00:26:22.318 [2024-12-15 05:16:42.359358] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.359469] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.318 [2024-12-15 05:16:42.359480] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:26:22.318 [2024-12-15 05:16:42.359488] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.094 ms 00:26:22.318 [2024-12-15 05:16:42.359495] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.318 [2024-12-15 05:16:42.359596] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:26:22.318 [2024-12-15 05:16:42.359608] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:26:22.318 [2024-12-15 05:16:42.359617] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:26:22.318 [2024-12-15 05:16:42.359635] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:26:22.318 [2024-12-15 05:16:42.359649] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:26:22.318 [2024-12-15 05:16:42.359657] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:26:22.318 [2024-12-15 05:16:42.359665] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:26:22.318 [2024-12-15 05:16:42.359674] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:26:22.318 [2024-12-15 05:16:42.359682] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:26:22.318 [2024-12-15 05:16:42.359690] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:26:22.318 [2024-12-15 05:16:42.359698] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:26:22.318 [2024-12-15 05:16:42.359712] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:26:22.319 [2024-12-15 05:16:42.359724] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:26:22.319 [2024-12-15 05:16:42.359732] ftl_layout.c: 130:dump_region: *NOTICE*: 
[FTL][ftl0] Region nvc_md 00:26:22.319 [2024-12-15 05:16:42.359741] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:26:22.319 [2024-12-15 05:16:42.359749] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:26:22.319 [2024-12-15 05:16:42.359757] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:26:22.319 [2024-12-15 05:16:42.359765] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:26:22.319 [2024-12-15 05:16:42.359773] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:26:22.319 [2024-12-15 05:16:42.359781] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:26:22.319 [2024-12-15 05:16:42.359789] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:26:22.319 [2024-12-15 05:16:42.359798] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:26:22.319 [2024-12-15 05:16:42.359805] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:26:22.319 [2024-12-15 05:16:42.359813] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:26:22.319 [2024-12-15 05:16:42.359821] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:26:22.319 [2024-12-15 05:16:42.359829] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:26:22.319 [2024-12-15 05:16:42.359836] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:26:22.319 [2024-12-15 05:16:42.359847] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:26:22.319 [2024-12-15 05:16:42.359854] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:26:22.319 [2024-12-15 05:16:42.359862] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:26:22.319 [2024-12-15 05:16:42.359869] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:26:22.319 [2024-12-15 05:16:42.359876] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:26:22.319 [2024-12-15 05:16:42.359884] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:26:22.319 [2024-12-15 05:16:42.359892] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:26:22.319 [2024-12-15 05:16:42.359900] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:26:22.319 [2024-12-15 05:16:42.359907] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:26:22.319 [2024-12-15 05:16:42.359915] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:26:22.319 [2024-12-15 05:16:42.359921] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:26:22.319 [2024-12-15 05:16:42.359928] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:26:22.319 [2024-12-15 05:16:42.359935] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:26:22.319 [2024-12-15 05:16:42.359942] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:26:22.319 [2024-12-15 05:16:42.359948] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:26:22.319 [2024-12-15 05:16:42.359955] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:26:22.319 [2024-12-15 05:16:42.359965] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:26:22.319 [2024-12-15 05:16:42.359978] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:26:22.319 [2024-12-15 
05:16:42.359986] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:26:22.319 [2024-12-15 05:16:42.359998] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:26:22.319 [2024-12-15 05:16:42.360009] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:26:22.319 [2024-12-15 05:16:42.360016] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:26:22.319 [2024-12-15 05:16:42.360023] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:26:22.319 [2024-12-15 05:16:42.360030] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:26:22.319 [2024-12-15 05:16:42.360037] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:26:22.319 [2024-12-15 05:16:42.360043] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:26:22.319 [2024-12-15 05:16:42.360052] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:26:22.319 [2024-12-15 05:16:42.360061] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:26:22.319 [2024-12-15 05:16:42.360071] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:26:22.319 [2024-12-15 05:16:42.360080] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:26:22.319 [2024-12-15 05:16:42.360087] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:26:22.319 [2024-12-15 05:16:42.360094] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:26:22.319 [2024-12-15 05:16:42.360104] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:26:22.319 [2024-12-15 05:16:42.360110] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:26:22.319 [2024-12-15 05:16:42.360118] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:26:22.319 [2024-12-15 05:16:42.360125] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:26:22.319 [2024-12-15 05:16:42.360132] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:26:22.319 [2024-12-15 05:16:42.360139] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:26:22.319 [2024-12-15 05:16:42.360146] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:26:22.319 [2024-12-15 05:16:42.360153] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:26:22.319 [2024-12-15 05:16:42.360160] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:26:22.319 [2024-12-15 05:16:42.360168] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: 
*NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:26:22.319 [2024-12-15 05:16:42.360175] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:26:22.319 [2024-12-15 05:16:42.360183] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:26:22.319 [2024-12-15 05:16:42.360193] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:26:22.319 [2024-12-15 05:16:42.360200] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:26:22.319 [2024-12-15 05:16:42.360207] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:26:22.319 [2024-12-15 05:16:42.360215] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:26:22.319 [2024-12-15 05:16:42.360240] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.319 [2024-12-15 05:16:42.360248] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:26:22.319 [2024-12-15 05:16:42.360256] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.712 ms 00:26:22.319 [2024-12-15 05:16:42.360270] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.319 [2024-12-15 05:16:42.374408] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.319 [2024-12-15 05:16:42.374481] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:26:22.319 [2024-12-15 05:16:42.374493] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 14.091 ms 00:26:22.319 [2024-12-15 05:16:42.374508] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.319 [2024-12-15 05:16:42.374598] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.319 [2024-12-15 05:16:42.374607] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:26:22.319 [2024-12-15 05:16:42.374616] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.067 ms 00:26:22.319 [2024-12-15 05:16:42.374625] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.319 [2024-12-15 05:16:42.396534] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.319 [2024-12-15 05:16:42.396774] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:26:22.319 [2024-12-15 05:16:42.396803] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 21.848 ms 00:26:22.319 [2024-12-15 05:16:42.396827] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.319 [2024-12-15 05:16:42.396892] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.319 [2024-12-15 05:16:42.396907] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:26:22.319 [2024-12-15 05:16:42.396921] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:26:22.319 [2024-12-15 05:16:42.396933] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.319 [2024-12-15 05:16:42.397553] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.319 [2024-12-15 05:16:42.397603] mngt/ftl_mngt.c: 
428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:26:22.319 [2024-12-15 05:16:42.397620] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.520 ms 00:26:22.319 [2024-12-15 05:16:42.397634] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.319 [2024-12-15 05:16:42.397853] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.319 [2024-12-15 05:16:42.397868] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:26:22.319 [2024-12-15 05:16:42.397881] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.182 ms 00:26:22.319 [2024-12-15 05:16:42.397893] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.320 [2024-12-15 05:16:42.406128] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.320 [2024-12-15 05:16:42.406175] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:26:22.320 [2024-12-15 05:16:42.406185] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.206 ms 00:26:22.320 [2024-12-15 05:16:42.406193] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.320 [2024-12-15 05:16:42.409997] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 4, empty chunks = 0 00:26:22.320 [2024-12-15 05:16:42.410048] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:26:22.320 [2024-12-15 05:16:42.410064] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.320 [2024-12-15 05:16:42.410072] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:26:22.320 [2024-12-15 05:16:42.410080] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.768 ms 00:26:22.320 [2024-12-15 05:16:42.410087] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.320 [2024-12-15 05:16:42.425781] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.320 [2024-12-15 05:16:42.425830] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:26:22.320 [2024-12-15 05:16:42.425850] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 15.633 ms 00:26:22.320 [2024-12-15 05:16:42.425858] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.320 [2024-12-15 05:16:42.428897] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.320 [2024-12-15 05:16:42.428946] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:26:22.320 [2024-12-15 05:16:42.428957] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.984 ms 00:26:22.320 [2024-12-15 05:16:42.428964] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.320 [2024-12-15 05:16:42.431801] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.320 [2024-12-15 05:16:42.431978] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:26:22.320 [2024-12-15 05:16:42.431998] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.791 ms 00:26:22.320 [2024-12-15 05:16:42.432006] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.320 [2024-12-15 05:16:42.432365] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.320 [2024-12-15 05:16:42.432381] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 
00:26:22.320 [2024-12-15 05:16:42.432391] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.287 ms 00:26:22.320 [2024-12-15 05:16:42.432405] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.580 [2024-12-15 05:16:42.457169] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.580 [2024-12-15 05:16:42.457231] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:26:22.580 [2024-12-15 05:16:42.457247] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 24.737 ms 00:26:22.580 [2024-12-15 05:16:42.457255] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.580 [2024-12-15 05:16:42.465419] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:26:22.580 [2024-12-15 05:16:42.468243] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.580 [2024-12-15 05:16:42.468282] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:26:22.580 [2024-12-15 05:16:42.468293] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 10.937 ms 00:26:22.580 [2024-12-15 05:16:42.468307] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.580 [2024-12-15 05:16:42.468376] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.580 [2024-12-15 05:16:42.468388] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:26:22.580 [2024-12-15 05:16:42.468396] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.012 ms 00:26:22.580 [2024-12-15 05:16:42.468404] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.580 [2024-12-15 05:16:42.470029] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.580 [2024-12-15 05:16:42.470076] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:26:22.580 [2024-12-15 05:16:42.470087] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.565 ms 00:26:22.580 [2024-12-15 05:16:42.470094] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.580 [2024-12-15 05:16:42.470119] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.580 [2024-12-15 05:16:42.470128] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:26:22.580 [2024-12-15 05:16:42.470141] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:26:22.580 [2024-12-15 05:16:42.470149] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.580 [2024-12-15 05:16:42.470185] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:26:22.580 [2024-12-15 05:16:42.470196] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.580 [2024-12-15 05:16:42.470204] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:26:22.580 [2024-12-15 05:16:42.470215] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.012 ms 00:26:22.580 [2024-12-15 05:16:42.470223] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.580 [2024-12-15 05:16:42.475427] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.580 [2024-12-15 05:16:42.475490] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:26:22.580 [2024-12-15 05:16:42.475501] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.187 ms 00:26:22.580 
[2024-12-15 05:16:42.475509] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.580 [2024-12-15 05:16:42.475583] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:26:22.580 [2024-12-15 05:16:42.475600] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:26:22.580 [2024-12-15 05:16:42.475609] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.035 ms 00:26:22.581 [2024-12-15 05:16:42.475620] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:26:22.581 [2024-12-15 05:16:42.476719] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 134.907 ms, result 0 00:26:23.524  [2024-12-15T05:16:45.052Z] Copying: 1040/1048576 [kB] (1040 kBps) [2024-12-15T05:16:45.997Z] Copying: 4528/1048576 [kB] (3488 kBps) [2024-12-15T05:16:46.943Z] Copying: 19/1024 [MB] (15 MBps) [2024-12-15T05:16:47.888Z] Copying: 54/1024 [MB] (35 MBps) [2024-12-15T05:16:48.834Z] Copying: 74/1024 [MB] (19 MBps) [2024-12-15T05:16:49.780Z] Copying: 103/1024 [MB] (28 MBps) [2024-12-15T05:16:50.724Z] Copying: 135/1024 [MB] (32 MBps) [2024-12-15T05:16:51.669Z] Copying: 157/1024 [MB] (21 MBps) [2024-12-15T05:16:53.057Z] Copying: 200/1024 [MB] (43 MBps) [2024-12-15T05:16:54.003Z] Copying: 233/1024 [MB] (33 MBps) [2024-12-15T05:16:54.947Z] Copying: 277/1024 [MB] (43 MBps) [2024-12-15T05:16:55.893Z] Copying: 305/1024 [MB] (27 MBps) [2024-12-15T05:16:56.838Z] Copying: 328/1024 [MB] (22 MBps) [2024-12-15T05:16:57.860Z] Copying: 344/1024 [MB] (16 MBps) [2024-12-15T05:16:58.813Z] Copying: 368/1024 [MB] (23 MBps) [2024-12-15T05:16:59.759Z] Copying: 385/1024 [MB] (16 MBps) [2024-12-15T05:17:00.704Z] Copying: 401/1024 [MB] (16 MBps) [2024-12-15T05:17:02.092Z] Copying: 417/1024 [MB] (16 MBps) [2024-12-15T05:17:02.664Z] Copying: 438/1024 [MB] (21 MBps) [2024-12-15T05:17:04.052Z] Copying: 460/1024 [MB] (22 MBps) [2024-12-15T05:17:04.997Z] Copying: 476/1024 [MB] (15 MBps) [2024-12-15T05:17:05.958Z] Copying: 502/1024 [MB] (26 MBps) [2024-12-15T05:17:06.901Z] Copying: 518/1024 [MB] (16 MBps) [2024-12-15T05:17:07.844Z] Copying: 546/1024 [MB] (27 MBps) [2024-12-15T05:17:08.789Z] Copying: 564/1024 [MB] (18 MBps) [2024-12-15T05:17:09.735Z] Copying: 594/1024 [MB] (30 MBps) [2024-12-15T05:17:10.681Z] Copying: 618/1024 [MB] (23 MBps) [2024-12-15T05:17:12.088Z] Copying: 635/1024 [MB] (16 MBps) [2024-12-15T05:17:12.660Z] Copying: 662/1024 [MB] (26 MBps) [2024-12-15T05:17:14.048Z] Copying: 686/1024 [MB] (24 MBps) [2024-12-15T05:17:14.993Z] Copying: 715/1024 [MB] (28 MBps) [2024-12-15T05:17:15.936Z] Copying: 745/1024 [MB] (29 MBps) [2024-12-15T05:17:16.880Z] Copying: 779/1024 [MB] (33 MBps) [2024-12-15T05:17:17.823Z] Copying: 798/1024 [MB] (18 MBps) [2024-12-15T05:17:18.932Z] Copying: 822/1024 [MB] (24 MBps) [2024-12-15T05:17:19.874Z] Copying: 846/1024 [MB] (24 MBps) [2024-12-15T05:17:20.827Z] Copying: 878/1024 [MB] (31 MBps) [2024-12-15T05:17:21.788Z] Copying: 902/1024 [MB] (23 MBps) [2024-12-15T05:17:22.731Z] Copying: 931/1024 [MB] (29 MBps) [2024-12-15T05:17:23.674Z] Copying: 956/1024 [MB] (24 MBps) [2024-12-15T05:17:25.062Z] Copying: 983/1024 [MB] (27 MBps) [2024-12-15T05:17:26.012Z] Copying: 999/1024 [MB] (15 MBps) [2024-12-15T05:17:26.012Z] Copying: 1023/1024 [MB] (24 MBps) [2024-12-15T05:17:26.586Z] Copying: 1024/1024 [MB] (average 23 MBps)[2024-12-15 05:17:26.273925] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.274377] mngt/ftl_mngt.c: 
428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:27:06.446 [2024-12-15 05:17:26.274412] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:27:06.446 [2024-12-15 05:17:26.274427] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.274515] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:27:06.446 [2024-12-15 05:17:26.275425] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.275491] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:27:06.446 [2024-12-15 05:17:26.275508] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.881 ms 00:27:06.446 [2024-12-15 05:17:26.275522] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.275991] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.276042] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:27:06.446 [2024-12-15 05:17:26.276059] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.421 ms 00:27:06.446 [2024-12-15 05:17:26.276073] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.292975] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.293029] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:27:06.446 [2024-12-15 05:17:26.293057] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 16.875 ms 00:27:06.446 [2024-12-15 05:17:26.293069] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.299841] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.299894] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:27:06.446 [2024-12-15 05:17:26.299910] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.718 ms 00:27:06.446 [2024-12-15 05:17:26.299923] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.303308] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.303365] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:27:06.446 [2024-12-15 05:17:26.303381] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.307 ms 00:27:06.446 [2024-12-15 05:17:26.303392] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.308838] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.308902] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:27:06.446 [2024-12-15 05:17:26.308916] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.371 ms 00:27:06.446 [2024-12-15 05:17:26.308929] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.314149] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.314322] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:27:06.446 [2024-12-15 05:17:26.314348] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.984 ms 00:27:06.446 [2024-12-15 05:17:26.314374] mngt/ftl_mngt.c: 
431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.317413] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.317480] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:27:06.446 [2024-12-15 05:17:26.317498] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.008 ms 00:27:06.446 [2024-12-15 05:17:26.317509] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.320308] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.320364] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:27:06.446 [2024-12-15 05:17:26.320380] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.740 ms 00:27:06.446 [2024-12-15 05:17:26.320392] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.322620] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.322675] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:27:06.446 [2024-12-15 05:17:26.322690] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.158 ms 00:27:06.446 [2024-12-15 05:17:26.322701] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.324811] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.446 [2024-12-15 05:17:26.324981] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:27:06.446 [2024-12-15 05:17:26.325003] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.019 ms 00:27:06.446 [2024-12-15 05:17:26.325014] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.446 [2024-12-15 05:17:26.325089] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:27:06.446 [2024-12-15 05:17:26.325115] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 261120 / 261120 wr_cnt: 1 state: closed 00:27:06.446 [2024-12-15 05:17:26.325133] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 1536 / 261120 wr_cnt: 1 state: open 00:27:06.447 [2024-12-15 05:17:26.325146] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325159] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325174] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325187] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325200] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325214] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325227] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325240] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325254] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 
00:27:06.447 [2024-12-15 05:17:26.325267] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325280] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325293] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325306] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325319] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325332] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325344] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325357] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325370] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325383] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325396] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325417] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325430] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325469] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325482] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325495] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325510] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325524] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325538] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325552] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325565] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325579] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325592] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325605] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325617] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 
wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325630] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325645] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325658] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325671] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325684] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325697] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325710] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325723] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325737] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325751] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325774] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325788] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325801] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325814] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325829] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325842] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325855] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325870] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325883] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325897] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325910] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325924] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325936] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325949] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325963] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] 
Band 61: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325975] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.325988] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326000] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326013] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326026] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326039] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326052] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326064] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326079] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326092] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326104] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326117] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326131] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326143] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326156] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326171] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326184] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:27:06.447 [2024-12-15 05:17:26.326199] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326212] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326225] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326238] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326258] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326272] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326285] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326297] ftl_debug.c: 167:ftl_dev_dump_bands: 
*NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326311] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326325] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326338] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326351] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326364] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326377] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326391] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326404] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326417] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326444] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326459] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326472] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326485] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326499] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:27:06.448 [2024-12-15 05:17:26.326524] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:27:06.448 [2024-12-15 05:17:26.326558] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: a3f0cc82-b3b1-4076-8939-aceeb9079b03 00:27:06.448 [2024-12-15 05:17:26.326573] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 262656 00:27:06.448 [2024-12-15 05:17:26.326586] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 166080 00:27:06.448 [2024-12-15 05:17:26.326600] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 164096 00:27:06.448 [2024-12-15 05:17:26.326615] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: 1.0121 00:27:06.448 [2024-12-15 05:17:26.326628] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:27:06.448 [2024-12-15 05:17:26.326642] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:27:06.448 [2024-12-15 05:17:26.326656] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:27:06.448 [2024-12-15 05:17:26.326669] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:27:06.448 [2024-12-15 05:17:26.326695] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:27:06.448 [2024-12-15 05:17:26.326708] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.448 [2024-12-15 05:17:26.326722] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: 
Dump statistics 00:27:06.448 [2024-12-15 05:17:26.326736] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.621 ms 00:27:06.448 [2024-12-15 05:17:26.326751] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.329297] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.448 [2024-12-15 05:17:26.329339] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:27:06.448 [2024-12-15 05:17:26.329358] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.511 ms 00:27:06.448 [2024-12-15 05:17:26.329373] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.329544] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:06.448 [2024-12-15 05:17:26.329567] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:27:06.448 [2024-12-15 05:17:26.329586] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.138 ms 00:27:06.448 [2024-12-15 05:17:26.329599] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.337370] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.337423] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:27:06.448 [2024-12-15 05:17:26.337464] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.337477] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.337569] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.337588] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:27:06.448 [2024-12-15 05:17:26.337602] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.337614] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.337704] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.337721] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:27:06.448 [2024-12-15 05:17:26.337740] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.337753] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.337776] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.337790] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:27:06.448 [2024-12-15 05:17:26.337803] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.337820] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.351381] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.351604] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:27:06.448 [2024-12-15 05:17:26.351629] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.351641] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.362021] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 
05:17:26.362073] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:27:06.448 [2024-12-15 05:17:26.362098] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.362110] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.362175] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.362189] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:27:06.448 [2024-12-15 05:17:26.362203] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.362214] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.362271] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.362286] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:27:06.448 [2024-12-15 05:17:26.362300] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.362313] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.362421] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.362469] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:27:06.448 [2024-12-15 05:17:26.362484] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.362497] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.362544] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.362559] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:27:06.448 [2024-12-15 05:17:26.362573] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.362608] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.362665] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.362684] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:27:06.448 [2024-12-15 05:17:26.362702] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.448 [2024-12-15 05:17:26.362719] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.448 [2024-12-15 05:17:26.362788] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:27:06.448 [2024-12-15 05:17:26.362811] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:27:06.449 [2024-12-15 05:17:26.362832] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:27:06.449 [2024-12-15 05:17:26.362847] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:06.449 [2024-12-15 05:17:26.363035] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 89.065 ms, result 0 00:27:06.449 00:27:06.449 00:27:06.449 05:17:26 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@94 -- # md5sum -c /home/vagrant/spdk_repo/spdk/test/ftl/testfile.md5 00:27:08.997 /home/vagrant/spdk_repo/spdk/test/ftl/testfile: OK 00:27:08.997 05:17:28 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@95 -- # 
/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=ftl0 --of=/home/vagrant/spdk_repo/spdk/test/ftl/testfile2 --count=262144 --skip=262144 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:27:08.997 [2024-12-15 05:17:28.680971] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:27:08.997 [2024-12-15 05:17:28.681061] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid94938 ] 00:27:08.997 [2024-12-15 05:17:28.832394] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:08.997 [2024-12-15 05:17:28.856414] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:27:08.997 [2024-12-15 05:17:28.972630] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:27:08.997 [2024-12-15 05:17:28.972727] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:27:08.997 [2024-12-15 05:17:29.133218] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:08.997 [2024-12-15 05:17:29.133280] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:27:08.997 [2024-12-15 05:17:29.133302] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:27:08.997 [2024-12-15 05:17:29.133315] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:08.997 [2024-12-15 05:17:29.133389] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:08.997 [2024-12-15 05:17:29.133406] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:27:08.997 [2024-12-15 05:17:29.133419] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.048 ms 00:27:08.997 [2024-12-15 05:17:29.133452] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:08.997 [2024-12-15 05:17:29.133496] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:27:08.997 [2024-12-15 05:17:29.133905] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:27:08.997 [2024-12-15 05:17:29.133947] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:08.997 [2024-12-15 05:17:29.133961] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:27:08.997 [2024-12-15 05:17:29.133979] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.467 ms 00:27:08.997 [2024-12-15 05:17:29.133990] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.260 [2024-12-15 05:17:29.135837] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:27:09.260 [2024-12-15 05:17:29.139723] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.260 [2024-12-15 05:17:29.139778] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:27:09.260 [2024-12-15 05:17:29.139802] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.889 ms 00:27:09.260 [2024-12-15 05:17:29.139817] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.260 [2024-12-15 05:17:29.139911] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.260 [2024-12-15 05:17:29.139928] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:27:09.260 
[2024-12-15 05:17:29.139943] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.034 ms 00:27:09.260 [2024-12-15 05:17:29.139955] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.260 [2024-12-15 05:17:29.148339] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.260 [2024-12-15 05:17:29.148389] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:27:09.260 [2024-12-15 05:17:29.148407] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.320 ms 00:27:09.260 [2024-12-15 05:17:29.148419] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.260 [2024-12-15 05:17:29.148562] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.260 [2024-12-15 05:17:29.148579] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:27:09.260 [2024-12-15 05:17:29.148597] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.092 ms 00:27:09.260 [2024-12-15 05:17:29.148610] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.260 [2024-12-15 05:17:29.148694] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.260 [2024-12-15 05:17:29.148710] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:27:09.260 [2024-12-15 05:17:29.148724] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.011 ms 00:27:09.260 [2024-12-15 05:17:29.148740] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.260 [2024-12-15 05:17:29.148775] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:27:09.260 [2024-12-15 05:17:29.150849] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.260 [2024-12-15 05:17:29.151020] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:27:09.260 [2024-12-15 05:17:29.151041] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.082 ms 00:27:09.260 [2024-12-15 05:17:29.151053] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.260 [2024-12-15 05:17:29.151114] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.260 [2024-12-15 05:17:29.151128] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:27:09.260 [2024-12-15 05:17:29.151147] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.018 ms 00:27:09.260 [2024-12-15 05:17:29.151167] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.260 [2024-12-15 05:17:29.151205] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:27:09.260 [2024-12-15 05:17:29.151239] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:27:09.260 [2024-12-15 05:17:29.151297] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:27:09.260 [2024-12-15 05:17:29.151328] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:27:09.260 [2024-12-15 05:17:29.151501] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:27:09.260 [2024-12-15 05:17:29.151521] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:27:09.260 [2024-12-15 05:17:29.151543] 
upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:27:09.260 [2024-12-15 05:17:29.151560] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:27:09.260 [2024-12-15 05:17:29.151576] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:27:09.260 [2024-12-15 05:17:29.151589] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:27:09.260 [2024-12-15 05:17:29.151603] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:27:09.260 [2024-12-15 05:17:29.151615] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:27:09.260 [2024-12-15 05:17:29.151627] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:27:09.260 [2024-12-15 05:17:29.151640] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.260 [2024-12-15 05:17:29.151653] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:27:09.260 [2024-12-15 05:17:29.151669] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.439 ms 00:27:09.260 [2024-12-15 05:17:29.151682] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.260 [2024-12-15 05:17:29.151809] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.260 [2024-12-15 05:17:29.151823] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:27:09.260 [2024-12-15 05:17:29.151842] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.094 ms 00:27:09.260 [2024-12-15 05:17:29.151854] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.260 [2024-12-15 05:17:29.151991] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:27:09.260 [2024-12-15 05:17:29.152008] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:27:09.260 [2024-12-15 05:17:29.152026] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:27:09.260 [2024-12-15 05:17:29.152051] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:27:09.260 [2024-12-15 05:17:29.152063] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:27:09.260 [2024-12-15 05:17:29.152076] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:27:09.260 [2024-12-15 05:17:29.152088] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:27:09.260 [2024-12-15 05:17:29.152099] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:27:09.260 [2024-12-15 05:17:29.152113] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:27:09.260 [2024-12-15 05:17:29.152128] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:27:09.260 [2024-12-15 05:17:29.152141] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:27:09.260 [2024-12-15 05:17:29.152153] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:27:09.260 [2024-12-15 05:17:29.152165] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:27:09.260 [2024-12-15 05:17:29.152177] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:27:09.260 [2024-12-15 05:17:29.152189] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:27:09.260 [2024-12-15 05:17:29.152202] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 
MiB 00:27:09.260 [2024-12-15 05:17:29.152214] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:27:09.260 [2024-12-15 05:17:29.152258] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:27:09.261 [2024-12-15 05:17:29.152269] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:27:09.261 [2024-12-15 05:17:29.152281] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:27:09.261 [2024-12-15 05:17:29.152293] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:27:09.261 [2024-12-15 05:17:29.152305] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:27:09.261 [2024-12-15 05:17:29.152317] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:27:09.261 [2024-12-15 05:17:29.152328] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:27:09.261 [2024-12-15 05:17:29.152339] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:27:09.261 [2024-12-15 05:17:29.152358] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:27:09.261 [2024-12-15 05:17:29.152370] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:27:09.261 [2024-12-15 05:17:29.152381] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:27:09.261 [2024-12-15 05:17:29.152393] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:27:09.261 [2024-12-15 05:17:29.152405] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:27:09.261 [2024-12-15 05:17:29.152416] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:27:09.261 [2024-12-15 05:17:29.152428] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:27:09.261 [2024-12-15 05:17:29.152456] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:27:09.261 [2024-12-15 05:17:29.152468] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:27:09.261 [2024-12-15 05:17:29.152479] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:27:09.261 [2024-12-15 05:17:29.152491] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:27:09.261 [2024-12-15 05:17:29.152504] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:27:09.261 [2024-12-15 05:17:29.152518] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:27:09.261 [2024-12-15 05:17:29.152530] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:27:09.261 [2024-12-15 05:17:29.152542] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:27:09.261 [2024-12-15 05:17:29.152554] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:27:09.261 [2024-12-15 05:17:29.152570] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:27:09.261 [2024-12-15 05:17:29.152582] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:27:09.261 [2024-12-15 05:17:29.152594] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:27:09.261 [2024-12-15 05:17:29.152610] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:27:09.261 [2024-12-15 05:17:29.152623] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:27:09.261 [2024-12-15 05:17:29.152636] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:27:09.261 [2024-12-15 05:17:29.152649] ftl_layout.c: 
130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:27:09.261 [2024-12-15 05:17:29.152661] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:27:09.261 [2024-12-15 05:17:29.152672] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:27:09.261 [2024-12-15 05:17:29.152684] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:27:09.261 [2024-12-15 05:17:29.152695] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:27:09.261 [2024-12-15 05:17:29.152707] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:27:09.261 [2024-12-15 05:17:29.152722] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:27:09.261 [2024-12-15 05:17:29.152736] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:27:09.261 [2024-12-15 05:17:29.152756] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:27:09.261 [2024-12-15 05:17:29.152769] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:27:09.261 [2024-12-15 05:17:29.152783] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:27:09.261 [2024-12-15 05:17:29.152796] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:27:09.261 [2024-12-15 05:17:29.152809] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:27:09.261 [2024-12-15 05:17:29.152822] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:27:09.261 [2024-12-15 05:17:29.152835] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:27:09.261 [2024-12-15 05:17:29.152848] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:27:09.261 [2024-12-15 05:17:29.152860] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:27:09.261 [2024-12-15 05:17:29.152872] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:27:09.261 [2024-12-15 05:17:29.152884] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:27:09.261 [2024-12-15 05:17:29.152897] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:27:09.261 [2024-12-15 05:17:29.152909] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:27:09.261 [2024-12-15 05:17:29.152922] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:27:09.261 [2024-12-15 05:17:29.152935] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:27:09.261 
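The NV cache layout above is reported twice: once by dump_region in MiB, and once by the superblock dump as hex blk_offs/blk_sz block counts. A minimal sketch of the conversion between the two, assuming the 4 KiB FTL block size that the paired figures imply (blocks_to_mib is an illustrative helper, not SPDK API; the region-type mapping is matched by offset and size in the log itself):

    # Converting the superblock's hex blk_offs/blk_sz fields to the MiB figures
    # printed by dump_region above. FTL_BLOCK_SIZE is an assumption (4 KiB),
    # chosen because it makes the two dumps agree exactly.
    FTL_BLOCK_SIZE = 4096  # bytes

    def blocks_to_mib(blocks: int) -> float:
        return blocks * FTL_BLOCK_SIZE / (1 << 20)

    # (name, blk_offs, blk_sz) copied from the "Region type" lines above;
    # type 0x0 lines up with sb, 0x2 with l2p, 0x3 with band_md.
    for name, offs, size in [("sb", 0x0, 0x20), ("l2p", 0x20, 0x5000),
                             ("band_md", 0x5020, 0x80)]:
        print(f"{name}: offset {blocks_to_mib(offs):.2f} MiB, "
              f"blocks {blocks_to_mib(size):.2f} MiB")
    # -> sb: offset 0.00 MiB, blocks 0.12 MiB
    # -> l2p: offset 0.12 MiB, blocks 80.00 MiB
    # -> band_md: offset 80.12 MiB, blocks 0.50 MiB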
[2024-12-15 05:17:29.152950] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:27:09.261 [2024-12-15 05:17:29.152965] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:27:09.261 [2024-12-15 05:17:29.152978] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:27:09.261 [2024-12-15 05:17:29.152994] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:27:09.261 [2024-12-15 05:17:29.153008] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:27:09.261 [2024-12-15 05:17:29.153021] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.261 [2024-12-15 05:17:29.153033] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:27:09.261 [2024-12-15 05:17:29.153047] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.124 ms 00:27:09.261 [2024-12-15 05:17:29.153067] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.261 [2024-12-15 05:17:29.167835] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.261 [2024-12-15 05:17:29.167887] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:27:09.261 [2024-12-15 05:17:29.167903] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 14.675 ms 00:27:09.261 [2024-12-15 05:17:29.167915] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.261 [2024-12-15 05:17:29.168024] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.261 [2024-12-15 05:17:29.168039] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:27:09.261 [2024-12-15 05:17:29.168053] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.075 ms 00:27:09.261 [2024-12-15 05:17:29.168065] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.261 [2024-12-15 05:17:29.189909] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.261 [2024-12-15 05:17:29.189977] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:27:09.261 [2024-12-15 05:17:29.190001] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 21.761 ms 00:27:09.261 [2024-12-15 05:17:29.190018] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.261 [2024-12-15 05:17:29.190090] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.261 [2024-12-15 05:17:29.190112] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:27:09.261 [2024-12-15 05:17:29.190139] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms 00:27:09.261 [2024-12-15 05:17:29.190155] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.261 [2024-12-15 05:17:29.190867] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.261 [2024-12-15 05:17:29.190910] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:27:09.261 [2024-12-15 05:17:29.190931] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.606 ms 00:27:09.261 [2024-12-15 05:17:29.190947] mngt/ftl_mngt.c: 431:trace_step: 
*NOTICE*: [FTL][ftl0] status: 0 00:27:09.261 [2024-12-15 05:17:29.191200] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.261 [2024-12-15 05:17:29.191234] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:27:09.261 [2024-12-15 05:17:29.191250] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.205 ms 00:27:09.261 [2024-12-15 05:17:29.191266] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.261 [2024-12-15 05:17:29.200200] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.261 [2024-12-15 05:17:29.200297] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:27:09.262 [2024-12-15 05:17:29.200318] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.886 ms 00:27:09.262 [2024-12-15 05:17:29.200335] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.203991] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 2, empty chunks = 2 00:27:09.262 [2024-12-15 05:17:29.204045] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:27:09.262 [2024-12-15 05:17:29.204067] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.204079] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:27:09.262 [2024-12-15 05:17:29.204092] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.491 ms 00:27:09.262 [2024-12-15 05:17:29.204103] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.220269] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.220324] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:27:09.262 [2024-12-15 05:17:29.220341] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 16.101 ms 00:27:09.262 [2024-12-15 05:17:29.220354] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.223370] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.223422] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:27:09.262 [2024-12-15 05:17:29.223460] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.948 ms 00:27:09.262 [2024-12-15 05:17:29.223473] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.226128] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.226288] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:27:09.262 [2024-12-15 05:17:29.226311] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.597 ms 00:27:09.262 [2024-12-15 05:17:29.226333] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.226825] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.226864] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:27:09.262 [2024-12-15 05:17:29.226881] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.397 ms 00:27:09.262 [2024-12-15 05:17:29.226906] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 
05:17:29.253257] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.253487] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:27:09.262 [2024-12-15 05:17:29.253516] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 26.304 ms 00:27:09.262 [2024-12-15 05:17:29.253528] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.262090] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:27:09.262 [2024-12-15 05:17:29.265292] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.265487] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:27:09.262 [2024-12-15 05:17:29.265512] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 11.614 ms 00:27:09.262 [2024-12-15 05:17:29.265529] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.265631] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.265648] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:27:09.262 [2024-12-15 05:17:29.265670] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.015 ms 00:27:09.262 [2024-12-15 05:17:29.265683] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.266560] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.266606] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:27:09.262 [2024-12-15 05:17:29.266621] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.821 ms 00:27:09.262 [2024-12-15 05:17:29.266632] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.266675] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.266689] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:27:09.262 [2024-12-15 05:17:29.266702] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms 00:27:09.262 [2024-12-15 05:17:29.266718] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.266769] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:27:09.262 [2024-12-15 05:17:29.266785] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.266806] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:27:09.262 [2024-12-15 05:17:29.266823] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.017 ms 00:27:09.262 [2024-12-15 05:17:29.266836] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.272620] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.272791] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:27:09.262 [2024-12-15 05:17:29.272824] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.752 ms 00:27:09.262 [2024-12-15 05:17:29.272837] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.273016] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:27:09.262 [2024-12-15 05:17:29.273046] 
mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:27:09.262 [2024-12-15 05:17:29.273062] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.054 ms 00:27:09.262 [2024-12-15 05:17:29.273081] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:27:09.262 [2024-12-15 05:17:29.274301] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 140.614 ms, result 0 00:27:10.648  [2024-12-15T05:17:31.735Z] Copying: 27/1024 [MB] (27 MBps) [2024-12-15T05:17:32.678Z] Copying: 43/1024 [MB] (16 MBps) [2024-12-15T05:17:33.621Z] Copying: 64/1024 [MB] (20 MBps) [2024-12-15T05:17:34.563Z] Copying: 82/1024 [MB] (18 MBps) [2024-12-15T05:17:35.504Z] Copying: 97/1024 [MB] (14 MBps) [2024-12-15T05:17:36.889Z] Copying: 115/1024 [MB] (18 MBps) [2024-12-15T05:17:37.461Z] Copying: 137/1024 [MB] (21 MBps) [2024-12-15T05:17:38.846Z] Copying: 152/1024 [MB] (15 MBps) [2024-12-15T05:17:39.789Z] Copying: 168/1024 [MB] (15 MBps) [2024-12-15T05:17:40.733Z] Copying: 185/1024 [MB] (16 MBps) [2024-12-15T05:17:41.676Z] Copying: 204/1024 [MB] (18 MBps) [2024-12-15T05:17:42.620Z] Copying: 222/1024 [MB] (18 MBps) [2024-12-15T05:17:43.565Z] Copying: 236/1024 [MB] (14 MBps) [2024-12-15T05:17:44.574Z] Copying: 256/1024 [MB] (19 MBps) [2024-12-15T05:17:45.514Z] Copying: 268/1024 [MB] (11 MBps) [2024-12-15T05:17:46.454Z] Copying: 287/1024 [MB] (19 MBps) [2024-12-15T05:17:47.839Z] Copying: 310/1024 [MB] (23 MBps) [2024-12-15T05:17:48.783Z] Copying: 333/1024 [MB] (22 MBps) [2024-12-15T05:17:49.726Z] Copying: 347/1024 [MB] (14 MBps) [2024-12-15T05:17:50.671Z] Copying: 365/1024 [MB] (18 MBps) [2024-12-15T05:17:51.615Z] Copying: 384/1024 [MB] (18 MBps) [2024-12-15T05:17:52.559Z] Copying: 401/1024 [MB] (16 MBps) [2024-12-15T05:17:53.503Z] Copying: 421/1024 [MB] (19 MBps) [2024-12-15T05:17:54.890Z] Copying: 442/1024 [MB] (20 MBps) [2024-12-15T05:17:55.461Z] Copying: 460/1024 [MB] (18 MBps) [2024-12-15T05:17:56.849Z] Copying: 478/1024 [MB] (17 MBps) [2024-12-15T05:17:57.792Z] Copying: 488/1024 [MB] (10 MBps) [2024-12-15T05:17:58.734Z] Copying: 499/1024 [MB] (10 MBps) [2024-12-15T05:17:59.676Z] Copying: 510/1024 [MB] (10 MBps) [2024-12-15T05:18:00.621Z] Copying: 520/1024 [MB] (10 MBps) [2024-12-15T05:18:01.569Z] Copying: 536/1024 [MB] (15 MBps) [2024-12-15T05:18:02.514Z] Copying: 558/1024 [MB] (21 MBps) [2024-12-15T05:18:03.458Z] Copying: 569/1024 [MB] (11 MBps) [2024-12-15T05:18:04.845Z] Copying: 582/1024 [MB] (13 MBps) [2024-12-15T05:18:05.788Z] Copying: 598/1024 [MB] (16 MBps) [2024-12-15T05:18:06.733Z] Copying: 610/1024 [MB] (12 MBps) [2024-12-15T05:18:07.714Z] Copying: 627/1024 [MB] (17 MBps) [2024-12-15T05:18:08.678Z] Copying: 647/1024 [MB] (19 MBps) [2024-12-15T05:18:09.623Z] Copying: 666/1024 [MB] (19 MBps) [2024-12-15T05:18:10.569Z] Copying: 682/1024 [MB] (15 MBps) [2024-12-15T05:18:11.514Z] Copying: 693/1024 [MB] (10 MBps) [2024-12-15T05:18:12.459Z] Copying: 714/1024 [MB] (21 MBps) [2024-12-15T05:18:13.850Z] Copying: 725/1024 [MB] (10 MBps) [2024-12-15T05:18:14.795Z] Copying: 735/1024 [MB] (10 MBps) [2024-12-15T05:18:15.739Z] Copying: 746/1024 [MB] (10 MBps) [2024-12-15T05:18:16.684Z] Copying: 762/1024 [MB] (16 MBps) [2024-12-15T05:18:17.629Z] Copying: 773/1024 [MB] (10 MBps) [2024-12-15T05:18:18.574Z] Copying: 783/1024 [MB] (10 MBps) [2024-12-15T05:18:19.520Z] Copying: 801/1024 [MB] (17 MBps) [2024-12-15T05:18:20.465Z] Copying: 812/1024 [MB] (11 MBps) [2024-12-15T05:18:21.852Z] Copying: 824/1024 [MB] (12 MBps) 
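The copy stream just below finishes at 1024/1024 [MB] with a reported average of 16 MBps. Both figures can be reproduced from the spdk_dd invocation earlier in this section and the surrounding log timestamps, again assuming the 4 KiB block size used throughout this FTL instance (the timestamp-based rate is a rough cross-check, not the tool's own measurement window):

    # Cross-checking the copy size and average rate reported below.
    # Assumption: --count/--skip are in 4 KiB FTL blocks, consistent with the
    # layout dump earlier in this run.
    count = 262144                        # --count from the spdk_dd command line
    total_mb = count * 4096 // (1 << 20)  # 1024 MB, matching "1024/1024 [MB]"

    start = 17 * 60 + 29.274              # 05:17:29.274 -- 'FTL startup' done
    end = 18 * 60 + 33.130                # 05:18:33.130 -- shutdown begins
    print(total_mb, round(total_mb / (end - start)))  # 1024, 16 (MBps)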
[2024-12-15T05:18:22.797Z] Copying: 835/1024 [MB] (10 MBps) [2024-12-15T05:18:23.743Z] Copying: 847/1024 [MB] (11 MBps) [2024-12-15T05:18:24.689Z] Copying: 860/1024 [MB] (13 MBps) [2024-12-15T05:18:25.633Z] Copying: 876/1024 [MB] (15 MBps) [2024-12-15T05:18:26.577Z] Copying: 899/1024 [MB] (22 MBps) [2024-12-15T05:18:27.522Z] Copying: 920/1024 [MB] (21 MBps) [2024-12-15T05:18:28.469Z] Copying: 940/1024 [MB] (19 MBps) [2024-12-15T05:18:29.855Z] Copying: 963/1024 [MB] (23 MBps) [2024-12-15T05:18:30.835Z] Copying: 976/1024 [MB] (12 MBps) [2024-12-15T05:18:31.807Z] Copying: 990/1024 [MB] (13 MBps) [2024-12-15T05:18:32.753Z] Copying: 1001/1024 [MB] (11 MBps) [2024-12-15T05:18:32.753Z] Copying: 1016/1024 [MB] (14 MBps) [2024-12-15T05:18:33.327Z] Copying: 1024/1024 [MB] (average 16 MBps)[2024-12-15 05:18:33.130680] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.130783] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:28:13.187 [2024-12-15 05:18:33.130811] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms 00:28:13.187 [2024-12-15 05:18:33.130822] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.130854] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:28:13.187 [2024-12-15 05:18:33.131998] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.132149] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:28:13.187 [2024-12-15 05:18:33.132281] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.121 ms 00:28:13.187 [2024-12-15 05:18:33.132315] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.132668] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.132703] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:28:13.187 [2024-12-15 05:18:33.132731] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.287 ms 00:28:13.187 [2024-12-15 05:18:33.132809] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.136639] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.136817] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:28:13.187 [2024-12-15 05:18:33.136836] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.785 ms 00:28:13.187 [2024-12-15 05:18:33.136845] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.143687] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.143728] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:28:13.187 [2024-12-15 05:18:33.143740] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.815 ms 00:28:13.187 [2024-12-15 05:18:33.143756] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.147601] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.147652] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:28:13.187 [2024-12-15 05:18:33.147663] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.777 ms 00:28:13.187 [2024-12-15 
05:18:33.147671] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.153101] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.153155] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:28:13.187 [2024-12-15 05:18:33.153167] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.383 ms 00:28:13.187 [2024-12-15 05:18:33.153176] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.157472] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.157516] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:28:13.187 [2024-12-15 05:18:33.157543] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.247 ms 00:28:13.187 [2024-12-15 05:18:33.157555] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.161166] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.161211] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:28:13.187 [2024-12-15 05:18:33.161221] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.593 ms 00:28:13.187 [2024-12-15 05:18:33.161229] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.163686] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.163732] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:28:13.187 [2024-12-15 05:18:33.163743] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.415 ms 00:28:13.187 [2024-12-15 05:18:33.163751] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.165576] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.165621] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:28:13.187 [2024-12-15 05:18:33.165631] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.784 ms 00:28:13.187 [2024-12-15 05:18:33.165640] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.167152] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.187 [2024-12-15 05:18:33.167327] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:28:13.187 [2024-12-15 05:18:33.167346] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.440 ms 00:28:13.187 [2024-12-15 05:18:33.167354] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.187 [2024-12-15 05:18:33.167390] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:28:13.187 [2024-12-15 05:18:33.167408] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 261120 / 261120 wr_cnt: 1 state: closed 00:28:13.187 [2024-12-15 05:18:33.167420] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 1536 / 261120 wr_cnt: 1 state: open 00:28:13.187 [2024-12-15 05:18:33.167430] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:28:13.187 [2024-12-15 05:18:33.167464] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:28:13.187 [2024-12-15 05:18:33.167474] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:28:13.187 [2024-12-15 05:18:33.167483] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:28:13.187 [2024-12-15 05:18:33.167492] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167500] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167508] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167515] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167523] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167532] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167539] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167547] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167556] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167564] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167572] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167579] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167587] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167595] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167603] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167610] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167618] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167626] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167634] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167642] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167649] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167845] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167853] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 
05:18:33.167861] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167870] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167878] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167886] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167894] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167902] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167910] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167918] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167926] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167934] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167942] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167949] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167957] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167965] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167973] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167981] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167989] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.167998] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168006] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168013] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168021] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168029] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168036] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168044] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168051] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 
00:28:13.188 [2024-12-15 05:18:33.168059] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168066] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168076] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168084] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168091] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168100] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168107] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168115] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168123] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168130] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168137] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168145] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168153] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168161] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168169] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168177] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168184] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168192] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168200] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168207] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168236] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168245] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168253] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168261] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:28:13.188 [2024-12-15 05:18:33.168269] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 
wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168277] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168286] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168293] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168302] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168310] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168318] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168326] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168334] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168342] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168351] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168358] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168366] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168373] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168381] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168390] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168398] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168405] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168413] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168421] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168430] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168450] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:28:13.189 [2024-12-15 05:18:33.168467] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:28:13.189 [2024-12-15 05:18:33.168475] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: a3f0cc82-b3b1-4076-8939-aceeb9079b03 00:28:13.189 [2024-12-15 05:18:33.168485] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 262656 00:28:13.189 [2024-12-15 05:18:33.168492] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:28:13.189 [2024-12-15 
05:18:33.168501] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:28:13.189 [2024-12-15 05:18:33.168510] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:28:13.189 [2024-12-15 05:18:33.168517] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:28:13.189 [2024-12-15 05:18:33.168537] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:28:13.189 [2024-12-15 05:18:33.168554] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:28:13.189 [2024-12-15 05:18:33.168568] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:28:13.189 [2024-12-15 05:18:33.168576] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:28:13.189 [2024-12-15 05:18:33.168583] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.189 [2024-12-15 05:18:33.168592] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:28:13.189 [2024-12-15 05:18:33.168602] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.194 ms 00:28:13.189 [2024-12-15 05:18:33.168610] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.170880] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.189 [2024-12-15 05:18:33.170911] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:28:13.189 [2024-12-15 05:18:33.170921] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.250 ms 00:28:13.189 [2024-12-15 05:18:33.170929] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.171053] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:28:13.189 [2024-12-15 05:18:33.171069] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:28:13.189 [2024-12-15 05:18:33.171078] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.094 ms 00:28:13.189 [2024-12-15 05:18:33.171085] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.178406] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.189 [2024-12-15 05:18:33.178615] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:28:13.189 [2024-12-15 05:18:33.178646] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.189 [2024-12-15 05:18:33.178655] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.178711] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.189 [2024-12-15 05:18:33.178719] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:28:13.189 [2024-12-15 05:18:33.178728] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.189 [2024-12-15 05:18:33.178740] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.178808] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.189 [2024-12-15 05:18:33.178819] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:28:13.189 [2024-12-15 05:18:33.178827] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.189 [2024-12-15 05:18:33.178838] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.178854] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: 
[FTL][ftl0] Rollback 00:28:13.189 [2024-12-15 05:18:33.178863] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:28:13.189 [2024-12-15 05:18:33.178871] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.189 [2024-12-15 05:18:33.178879] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.191902] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.189 [2024-12-15 05:18:33.191950] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:28:13.189 [2024-12-15 05:18:33.191962] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.189 [2024-12-15 05:18:33.191978] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.202011] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.189 [2024-12-15 05:18:33.202191] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:28:13.189 [2024-12-15 05:18:33.202208] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.189 [2024-12-15 05:18:33.202216] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.202264] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.189 [2024-12-15 05:18:33.202275] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:28:13.189 [2024-12-15 05:18:33.202284] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.189 [2024-12-15 05:18:33.202292] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.202334] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.189 [2024-12-15 05:18:33.202344] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:28:13.189 [2024-12-15 05:18:33.202352] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.189 [2024-12-15 05:18:33.202360] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.202588] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.189 [2024-12-15 05:18:33.202622] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:28:13.189 [2024-12-15 05:18:33.202643] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.189 [2024-12-15 05:18:33.202662] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.189 [2024-12-15 05:18:33.202717] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.189 [2024-12-15 05:18:33.202791] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:28:13.189 [2024-12-15 05:18:33.202812] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.189 [2024-12-15 05:18:33.202822] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.190 [2024-12-15 05:18:33.202862] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.190 [2024-12-15 05:18:33.202871] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:28:13.190 [2024-12-15 05:18:33.202880] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.190 [2024-12-15 05:18:33.202888] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.190 
[2024-12-15 05:18:33.202934] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:28:13.190 [2024-12-15 05:18:33.202944] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:28:13.190 [2024-12-15 05:18:33.202952] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:28:13.190 [2024-12-15 05:18:33.202961] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:28:13.190 [2024-12-15 05:18:33.203083] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 72.396 ms, result 0 00:28:13.451 00:28:13.451 00:28:13.451 05:18:33 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@96 -- # md5sum -c /home/vagrant/spdk_repo/spdk/test/ftl/testfile2.md5 00:28:15.999 /home/vagrant/spdk_repo/spdk/test/ftl/testfile2: OK 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@98 -- # trap - SIGINT SIGTERM EXIT 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@99 -- # restore_kill 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@31 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@32 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/testfile 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@33 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/testfile2 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@34 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/testfile.md5 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@35 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/testfile2.md5 00:28:15.999 Process with pid 93129 is not found 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@37 -- # killprocess 93129 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@954 -- # '[' -z 93129 ']' 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@958 -- # kill -0 93129 00:28:15.999 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 958: kill: (93129) - No such process 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@981 -- # echo 'Process with pid 93129 is not found' 00:28:15.999 05:18:35 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@38 -- # rmmod nbd 00:28:15.999 05:18:36 ftl.ftl_dirty_shutdown -- ftl/dirty_shutdown.sh@39 -- # remove_shm 00:28:15.999 Remove shared memory files 00:28:15.999 05:18:36 ftl.ftl_dirty_shutdown -- ftl/common.sh@204 -- # echo Remove shared memory files 00:28:15.999 05:18:36 ftl.ftl_dirty_shutdown -- ftl/common.sh@205 -- # rm -f rm -f 00:28:15.999 05:18:36 ftl.ftl_dirty_shutdown -- ftl/common.sh@206 -- # rm -f rm -f 00:28:15.999 05:18:36 ftl.ftl_dirty_shutdown -- ftl/common.sh@207 -- # rm -f rm -f 00:28:15.999 05:18:36 ftl.ftl_dirty_shutdown -- ftl/common.sh@208 -- # rm -f rm -f /dev/shm/iscsi 00:28:15.999 05:18:36 ftl.ftl_dirty_shutdown -- ftl/common.sh@209 -- # rm -f rm -f 00:28:15.999 ************************************ 00:28:15.999 END TEST ftl_dirty_shutdown 00:28:15.999 ************************************ 00:28:15.999 00:28:15.999 real 3m57.587s 00:28:15.999 user 4m20.108s 00:28:15.999 sys 0m26.355s 00:28:15.999 05:18:36 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@1130 -- # xtrace_disable 00:28:15.999 05:18:36 ftl.ftl_dirty_shutdown -- common/autotest_common.sh@10 -- # set +x 00:28:15.999 05:18:36 ftl -- ftl/ftl.sh@78 -- # run_test ftl_upgrade_shutdown 
/home/vagrant/spdk_repo/spdk/test/ftl/upgrade_shutdown.sh 0000:00:11.0 0000:00:10.0 00:28:15.999 05:18:36 ftl -- common/autotest_common.sh@1105 -- # '[' 4 -le 1 ']' 00:28:15.999 05:18:36 ftl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:28:15.999 05:18:36 ftl -- common/autotest_common.sh@10 -- # set +x 00:28:16.261 ************************************ 00:28:16.261 START TEST ftl_upgrade_shutdown 00:28:16.261 ************************************ 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/ftl/upgrade_shutdown.sh 0000:00:11.0 0000:00:10.0 00:28:16.261 * Looking for test storage... 00:28:16.261 * Found test storage at /home/vagrant/spdk_repo/spdk/test/ftl 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1711 -- # lcov --version 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@333 -- # local ver1 ver1_l 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@334 -- # local ver2 ver2_l 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@336 -- # IFS=.-: 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@336 -- # read -ra ver1 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@337 -- # IFS=.-: 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@337 -- # read -ra ver2 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@338 -- # local 'op=<' 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@340 -- # ver1_l=2 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@341 -- # ver2_l=1 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@344 -- # case "$op" in 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@345 -- # : 1 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@364 -- # (( v = 0 )) 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@365 -- # decimal 1 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@353 -- # local d=1 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@355 -- # echo 1 00:28:16.261 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@365 -- # ver1[v]=1 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@366 -- # decimal 2 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@353 -- # local d=2 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@355 -- # echo 2 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@366 -- # ver2[v]=2 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- scripts/common.sh@368 -- # return 0 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:28:16.262 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:28:16.262 --rc genhtml_branch_coverage=1 00:28:16.262 --rc genhtml_function_coverage=1 00:28:16.262 --rc genhtml_legend=1 00:28:16.262 --rc geninfo_all_blocks=1 00:28:16.262 --rc geninfo_unexecuted_blocks=1 00:28:16.262 00:28:16.262 ' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:28:16.262 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:28:16.262 --rc genhtml_branch_coverage=1 00:28:16.262 --rc genhtml_function_coverage=1 00:28:16.262 --rc genhtml_legend=1 00:28:16.262 --rc geninfo_all_blocks=1 00:28:16.262 --rc geninfo_unexecuted_blocks=1 00:28:16.262 00:28:16.262 ' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:28:16.262 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:28:16.262 --rc genhtml_branch_coverage=1 00:28:16.262 --rc genhtml_function_coverage=1 00:28:16.262 --rc genhtml_legend=1 00:28:16.262 --rc geninfo_all_blocks=1 00:28:16.262 --rc geninfo_unexecuted_blocks=1 00:28:16.262 00:28:16.262 ' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:28:16.262 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:28:16.262 --rc genhtml_branch_coverage=1 00:28:16.262 --rc genhtml_function_coverage=1 00:28:16.262 --rc genhtml_legend=1 00:28:16.262 --rc geninfo_all_blocks=1 00:28:16.262 --rc geninfo_unexecuted_blocks=1 00:28:16.262 00:28:16.262 ' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@8 -- # source /home/vagrant/spdk_repo/spdk/test/ftl/common.sh 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/ftl/upgrade_shutdown.sh 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@8 -- # testdir=/home/vagrant/spdk_repo/spdk/test/ftl 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- 
ftl/common.sh@9 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl/../.. 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@9 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@10 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@12 -- # export 'ftl_tgt_core_mask=[0]' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@12 -- # ftl_tgt_core_mask='[0]' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@14 -- # export spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@14 -- # spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@15 -- # export 'spdk_tgt_cpumask=[0]' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@15 -- # spdk_tgt_cpumask='[0]' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@16 -- # export spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@16 -- # spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@17 -- # export spdk_tgt_pid= 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@17 -- # spdk_tgt_pid= 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@19 -- # export spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@19 -- # spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@20 -- # export 'spdk_ini_cpumask=[1]' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@20 -- # spdk_ini_cpumask='[1]' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@21 -- # export spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@21 -- # spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@22 -- # export spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@22 -- # spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@23 -- # export spdk_ini_pid= 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@23 -- # spdk_ini_pid= 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@25 -- # export spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@25 -- # spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@17 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@19 -- # export FTL_BDEV=ftl 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@19 -- # FTL_BDEV=ftl 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@20 -- # export FTL_BASE=0000:00:11.0 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@20 -- # FTL_BASE=0000:00:11.0 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@21 -- # export FTL_BASE_SIZE=20480 00:28:16.262 05:18:36 
ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@21 -- # FTL_BASE_SIZE=20480 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@22 -- # export FTL_CACHE=0000:00:10.0 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@22 -- # FTL_CACHE=0000:00:10.0 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@23 -- # export FTL_CACHE_SIZE=5120 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@23 -- # FTL_CACHE_SIZE=5120 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@24 -- # export FTL_L2P_DRAM_LIMIT=2 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@24 -- # FTL_L2P_DRAM_LIMIT=2 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@26 -- # tcp_target_setup 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@81 -- # local base_bdev= 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@82 -- # local cache_bdev= 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@84 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json ]] 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@89 -- # spdk_tgt_pid=95690 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@90 -- # export spdk_tgt_pid 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@91 -- # waitforlisten 95690 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- ftl/common.sh@87 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt '--cpumask=[0]' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@835 -- # '[' -z 95690 ']' 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@840 -- # local max_retries=100 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:16.262 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@844 -- # xtrace_disable 00:28:16.262 05:18:36 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@10 -- # set +x 00:28:16.262 [2024-12-15 05:18:36.393569] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:28:16.262 [2024-12-15 05:18:36.393885] [ DPDK EAL parameters: spdk_tgt --no-shconf -l 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid95690 ] 00:28:16.523 [2024-12-15 05:18:36.558407] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:16.523 [2024-12-15 05:18:36.587143] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:28:17.466 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:28:17.466 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@868 -- # return 0 00:28:17.466 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@93 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json ]] 00:28:17.466 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@99 -- # params=('FTL_BDEV' 'FTL_BASE' 'FTL_BASE_SIZE' 'FTL_CACHE' 'FTL_CACHE_SIZE' 'FTL_L2P_DRAM_LIMIT') 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@99 -- # local params 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@100 -- # for param in "${params[@]}" 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@101 -- # [[ -z ftl ]] 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@100 -- # for param in "${params[@]}" 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@101 -- # [[ -z 0000:00:11.0 ]] 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@100 -- # for param in "${params[@]}" 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@101 -- # [[ -z 20480 ]] 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@100 -- # for param in "${params[@]}" 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@101 -- # [[ -z 0000:00:10.0 ]] 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@100 -- # for param in "${params[@]}" 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@101 -- # [[ -z 5120 ]] 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@100 -- # for param in "${params[@]}" 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@101 -- # [[ -z 2 ]] 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@107 -- # create_base_bdev base 0000:00:11.0 20480 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@54 -- # local name=base 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@55 -- # local base_bdf=0000:00:11.0 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@56 -- # local size=20480 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@59 -- # local base_bdev 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@60 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b base -t PCIe -a 0000:00:11.0 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@60 -- # base_bdev=basen1 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@62 -- # local base_size 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@63 -- # get_bdev_size basen1 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1382 -- # local bdev_name=basen1 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1383 -- # local bdev_info 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1384 -- # local bs 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1385 
-- # local nb 00:28:17.467 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b basen1 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:28:17.728 { 00:28:17.728 "name": "basen1", 00:28:17.728 "aliases": [ 00:28:17.728 "cb6e073a-f478-469a-b117-62fc9ead1a07" 00:28:17.728 ], 00:28:17.728 "product_name": "NVMe disk", 00:28:17.728 "block_size": 4096, 00:28:17.728 "num_blocks": 1310720, 00:28:17.728 "uuid": "cb6e073a-f478-469a-b117-62fc9ead1a07", 00:28:17.728 "numa_id": -1, 00:28:17.728 "assigned_rate_limits": { 00:28:17.728 "rw_ios_per_sec": 0, 00:28:17.728 "rw_mbytes_per_sec": 0, 00:28:17.728 "r_mbytes_per_sec": 0, 00:28:17.728 "w_mbytes_per_sec": 0 00:28:17.728 }, 00:28:17.728 "claimed": true, 00:28:17.728 "claim_type": "read_many_write_one", 00:28:17.728 "zoned": false, 00:28:17.728 "supported_io_types": { 00:28:17.728 "read": true, 00:28:17.728 "write": true, 00:28:17.728 "unmap": true, 00:28:17.728 "flush": true, 00:28:17.728 "reset": true, 00:28:17.728 "nvme_admin": true, 00:28:17.728 "nvme_io": true, 00:28:17.728 "nvme_io_md": false, 00:28:17.728 "write_zeroes": true, 00:28:17.728 "zcopy": false, 00:28:17.728 "get_zone_info": false, 00:28:17.728 "zone_management": false, 00:28:17.728 "zone_append": false, 00:28:17.728 "compare": true, 00:28:17.728 "compare_and_write": false, 00:28:17.728 "abort": true, 00:28:17.728 "seek_hole": false, 00:28:17.728 "seek_data": false, 00:28:17.728 "copy": true, 00:28:17.728 "nvme_iov_md": false 00:28:17.728 }, 00:28:17.728 "driver_specific": { 00:28:17.728 "nvme": [ 00:28:17.728 { 00:28:17.728 "pci_address": "0000:00:11.0", 00:28:17.728 "trid": { 00:28:17.728 "trtype": "PCIe", 00:28:17.728 "traddr": "0000:00:11.0" 00:28:17.728 }, 00:28:17.728 "ctrlr_data": { 00:28:17.728 "cntlid": 0, 00:28:17.728 "vendor_id": "0x1b36", 00:28:17.728 "model_number": "QEMU NVMe Ctrl", 00:28:17.728 "serial_number": "12341", 00:28:17.728 "firmware_revision": "8.0.0", 00:28:17.728 "subnqn": "nqn.2019-08.org.qemu:12341", 00:28:17.728 "oacs": { 00:28:17.728 "security": 0, 00:28:17.728 "format": 1, 00:28:17.728 "firmware": 0, 00:28:17.728 "ns_manage": 1 00:28:17.728 }, 00:28:17.728 "multi_ctrlr": false, 00:28:17.728 "ana_reporting": false 00:28:17.728 }, 00:28:17.728 "vs": { 00:28:17.728 "nvme_version": "1.4" 00:28:17.728 }, 00:28:17.728 "ns_data": { 00:28:17.728 "id": 1, 00:28:17.728 "can_share": false 00:28:17.728 } 00:28:17.728 } 00:28:17.728 ], 00:28:17.728 "mp_policy": "active_passive" 00:28:17.728 } 00:28:17.728 } 00:28:17.728 ]' 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1387 -- # bs=4096 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1388 -- # nb=1310720 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1391 -- # bdev_size=5120 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1392 -- # echo 5120 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@63 -- # base_size=5120 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@64 -- # [[ 20480 -le 5120 ]] 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@67 -- # clear_lvols 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- 
ftl/common.sh@28 -- # jq -r '.[] | .uuid' 00:28:17.728 05:18:37 ftl.ftl_upgrade_shutdown -- ftl/common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_get_lvstores 00:28:17.989 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@28 -- # stores=ae21f3e8-031a-4d3c-a3cb-0384416cd2f7 00:28:17.989 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@29 -- # for lvs in $stores 00:28:17.989 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@30 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -u ae21f3e8-031a-4d3c-a3cb-0384416cd2f7 00:28:18.249 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@68 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create_lvstore basen1 lvs 00:28:18.510 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@68 -- # lvs=4dc5f2db-978b-4031-bbe3-e1880a01b422 00:28:18.510 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@69 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create basen1p0 20480 -t -u 4dc5f2db-978b-4031-bbe3-e1880a01b422 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@107 -- # base_bdev=efb5d820-d183-4843-aad8-0a0b6f300206 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@108 -- # [[ -z efb5d820-d183-4843-aad8-0a0b6f300206 ]] 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@113 -- # create_nv_cache_bdev cache 0000:00:10.0 efb5d820-d183-4843-aad8-0a0b6f300206 5120 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@35 -- # local name=cache 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@36 -- # local cache_bdf=0000:00:10.0 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@37 -- # local base_bdev=efb5d820-d183-4843-aad8-0a0b6f300206 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@38 -- # local cache_size=5120 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@41 -- # get_bdev_size efb5d820-d183-4843-aad8-0a0b6f300206 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1382 -- # local bdev_name=efb5d820-d183-4843-aad8-0a0b6f300206 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1383 -- # local bdev_info 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1384 -- # local bs 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1385 -- # local nb 00:28:18.771 05:18:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b efb5d820-d183-4843-aad8-0a0b6f300206 00:28:19.032 05:18:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:28:19.032 { 00:28:19.032 "name": "efb5d820-d183-4843-aad8-0a0b6f300206", 00:28:19.032 "aliases": [ 00:28:19.032 "lvs/basen1p0" 00:28:19.032 ], 00:28:19.032 "product_name": "Logical Volume", 00:28:19.032 "block_size": 4096, 00:28:19.032 "num_blocks": 5242880, 00:28:19.032 "uuid": "efb5d820-d183-4843-aad8-0a0b6f300206", 00:28:19.032 "assigned_rate_limits": { 00:28:19.032 "rw_ios_per_sec": 0, 00:28:19.032 "rw_mbytes_per_sec": 0, 00:28:19.032 "r_mbytes_per_sec": 0, 00:28:19.032 "w_mbytes_per_sec": 0 00:28:19.032 }, 00:28:19.032 "claimed": false, 00:28:19.032 "zoned": false, 00:28:19.032 "supported_io_types": { 00:28:19.032 "read": true, 00:28:19.032 "write": true, 00:28:19.032 "unmap": true, 00:28:19.032 "flush": false, 00:28:19.032 "reset": true, 00:28:19.032 "nvme_admin": false, 00:28:19.032 "nvme_io": false, 00:28:19.032 "nvme_io_md": false, 00:28:19.033 "write_zeroes": 
true, 00:28:19.033 "zcopy": false, 00:28:19.033 "get_zone_info": false, 00:28:19.033 "zone_management": false, 00:28:19.033 "zone_append": false, 00:28:19.033 "compare": false, 00:28:19.033 "compare_and_write": false, 00:28:19.033 "abort": false, 00:28:19.033 "seek_hole": true, 00:28:19.033 "seek_data": true, 00:28:19.033 "copy": false, 00:28:19.033 "nvme_iov_md": false 00:28:19.033 }, 00:28:19.033 "driver_specific": { 00:28:19.033 "lvol": { 00:28:19.033 "lvol_store_uuid": "4dc5f2db-978b-4031-bbe3-e1880a01b422", 00:28:19.033 "base_bdev": "basen1", 00:28:19.033 "thin_provision": true, 00:28:19.033 "num_allocated_clusters": 0, 00:28:19.033 "snapshot": false, 00:28:19.033 "clone": false, 00:28:19.033 "esnap_clone": false 00:28:19.033 } 00:28:19.033 } 00:28:19.033 } 00:28:19.033 ]' 00:28:19.033 05:18:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:28:19.033 05:18:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1387 -- # bs=4096 00:28:19.033 05:18:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:28:19.033 05:18:39 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1388 -- # nb=5242880 00:28:19.033 05:18:39 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1391 -- # bdev_size=20480 00:28:19.033 05:18:39 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1392 -- # echo 20480 00:28:19.033 05:18:39 ftl.ftl_upgrade_shutdown -- ftl/common.sh@41 -- # local base_size=1024 00:28:19.033 05:18:39 ftl.ftl_upgrade_shutdown -- ftl/common.sh@44 -- # local nvc_bdev 00:28:19.033 05:18:39 ftl.ftl_upgrade_shutdown -- ftl/common.sh@45 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b cache -t PCIe -a 0000:00:10.0 00:28:19.294 05:18:39 ftl.ftl_upgrade_shutdown -- ftl/common.sh@45 -- # nvc_bdev=cachen1 00:28:19.294 05:18:39 ftl.ftl_upgrade_shutdown -- ftl/common.sh@47 -- # [[ -z 5120 ]] 00:28:19.294 05:18:39 ftl.ftl_upgrade_shutdown -- ftl/common.sh@50 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_split_create cachen1 -s 5120 1 00:28:19.556 05:18:39 ftl.ftl_upgrade_shutdown -- ftl/common.sh@113 -- # cache_bdev=cachen1p0 00:28:19.556 05:18:39 ftl.ftl_upgrade_shutdown -- ftl/common.sh@114 -- # [[ -z cachen1p0 ]] 00:28:19.556 05:18:39 ftl.ftl_upgrade_shutdown -- ftl/common.sh@119 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -t 60 bdev_ftl_create -b ftl -d efb5d820-d183-4843-aad8-0a0b6f300206 -c cachen1p0 --l2p_dram_limit 2 00:28:19.556 [2024-12-15 05:18:39.620097] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.556 [2024-12-15 05:18:39.620140] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Check configuration 00:28:19.556 [2024-12-15 05:18:39.620151] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.003 ms 00:28:19.556 [2024-12-15 05:18:39.620158] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.556 [2024-12-15 05:18:39.620205] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.556 [2024-12-15 05:18:39.620221] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Open base bdev 00:28:19.556 [2024-12-15 05:18:39.620229] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.034 ms 00:28:19.556 [2024-12-15 05:18:39.620238] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.556 [2024-12-15 05:18:39.620254] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl] Using cachen1p0 as write buffer cache 00:28:19.556 [2024-12-15 
05:18:39.620463] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl] Using bdev as NV Cache device 00:28:19.557 [2024-12-15 05:18:39.620474] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.557 [2024-12-15 05:18:39.620481] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Open cache bdev 00:28:19.557 [2024-12-15 05:18:39.620488] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.223 ms 00:28:19.557 [2024-12-15 05:18:39.620496] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.557 [2024-12-15 05:18:39.620517] mngt/ftl_mngt_md.c: 570:ftl_mngt_superblock_init: *NOTICE*: [FTL][ftl] Create new FTL, UUID b0da1c54-e331-4f00-a60a-a11723dff425 00:28:19.557 [2024-12-15 05:18:39.621474] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.557 [2024-12-15 05:18:39.621498] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Default-initialize superblock 00:28:19.557 [2024-12-15 05:18:39.621507] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.021 ms 00:28:19.557 [2024-12-15 05:18:39.621513] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.557 [2024-12-15 05:18:39.626206] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.557 [2024-12-15 05:18:39.626236] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize memory pools 00:28:19.557 [2024-12-15 05:18:39.626244] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 4.653 ms 00:28:19.557 [2024-12-15 05:18:39.626250] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.557 [2024-12-15 05:18:39.626317] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.557 [2024-12-15 05:18:39.626326] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize bands 00:28:19.557 [2024-12-15 05:18:39.626334] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.016 ms 00:28:19.557 [2024-12-15 05:18:39.626339] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.557 [2024-12-15 05:18:39.626381] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.557 [2024-12-15 05:18:39.626388] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Register IO device 00:28:19.557 [2024-12-15 05:18:39.626396] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.008 ms 00:28:19.557 [2024-12-15 05:18:39.626401] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.557 [2024-12-15 05:18:39.626419] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl] FTL IO channel created on app_thread 00:28:19.557 [2024-12-15 05:18:39.627662] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.557 [2024-12-15 05:18:39.627688] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize core IO channel 00:28:19.557 [2024-12-15 05:18:39.627696] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.248 ms 00:28:19.557 [2024-12-15 05:18:39.627703] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.557 [2024-12-15 05:18:39.627725] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.557 [2024-12-15 05:18:39.627733] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Decorate bands 00:28:19.557 [2024-12-15 05:18:39.627739] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.004 ms 00:28:19.557 [2024-12-15 05:18:39.627748] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: 
[FTL][ftl] status: 0 00:28:19.557 [2024-12-15 05:18:39.627760] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl] FTL layout setup mode 1 00:28:19.557 [2024-12-15 05:18:39.627872] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl] nvc layout blob store 0x150 bytes 00:28:19.557 [2024-12-15 05:18:39.627880] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl] base layout blob store 0x48 bytes 00:28:19.557 [2024-12-15 05:18:39.627894] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl] layout blob store 0x190 bytes 00:28:19.557 [2024-12-15 05:18:39.627901] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl] Base device capacity: 20480.00 MiB 00:28:19.557 [2024-12-15 05:18:39.627915] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl] NV cache device capacity: 5120.00 MiB 00:28:19.557 [2024-12-15 05:18:39.627922] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl] L2P entries: 3774873 00:28:19.557 [2024-12-15 05:18:39.627931] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl] L2P address size: 4 00:28:19.557 [2024-12-15 05:18:39.627936] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl] P2L checkpoint pages: 2048 00:28:19.557 [2024-12-15 05:18:39.627943] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl] NV cache chunk count 5 00:28:19.557 [2024-12-15 05:18:39.627948] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.557 [2024-12-15 05:18:39.627955] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize layout 00:28:19.557 [2024-12-15 05:18:39.627961] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.189 ms 00:28:19.557 [2024-12-15 05:18:39.627968] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.557 [2024-12-15 05:18:39.628033] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.557 [2024-12-15 05:18:39.628042] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Verify layout 00:28:19.557 [2024-12-15 05:18:39.628048] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.053 ms 00:28:19.557 [2024-12-15 05:18:39.628056] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.557 [2024-12-15 05:18:39.628127] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl] NV cache layout: 00:28:19.557 [2024-12-15 05:18:39.628136] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region sb 00:28:19.557 [2024-12-15 05:18:39.628142] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.00 MiB 00:28:19.557 [2024-12-15 05:18:39.628149] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628155] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region l2p 00:28:19.557 [2024-12-15 05:18:39.628162] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628167] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 14.50 MiB 00:28:19.557 [2024-12-15 05:18:39.628174] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region band_md 00:28:19.557 [2024-12-15 05:18:39.628179] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 14.62 MiB 00:28:19.557 [2024-12-15 05:18:39.628185] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628190] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region band_md_mirror 00:28:19.557 [2024-12-15 05:18:39.628196] ftl_layout.c: 131:dump_region: *NOTICE*: 
[FTL][ftl] offset: 14.75 MiB 00:28:19.557 [2024-12-15 05:18:39.628201] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628209] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region nvc_md 00:28:19.557 [2024-12-15 05:18:39.628222] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.38 MiB 00:28:19.557 [2024-12-15 05:18:39.628229] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628235] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region nvc_md_mirror 00:28:19.557 [2024-12-15 05:18:39.628241] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.50 MiB 00:28:19.557 [2024-12-15 05:18:39.628246] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628253] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l0 00:28:19.557 [2024-12-15 05:18:39.628258] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 14.88 MiB 00:28:19.557 [2024-12-15 05:18:39.628264] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:28:19.557 [2024-12-15 05:18:39.628270] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l1 00:28:19.557 [2024-12-15 05:18:39.628276] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 22.88 MiB 00:28:19.557 [2024-12-15 05:18:39.628281] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:28:19.557 [2024-12-15 05:18:39.628288] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l2 00:28:19.557 [2024-12-15 05:18:39.628292] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 30.88 MiB 00:28:19.557 [2024-12-15 05:18:39.628299] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:28:19.557 [2024-12-15 05:18:39.628305] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l3 00:28:19.557 [2024-12-15 05:18:39.628314] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 38.88 MiB 00:28:19.557 [2024-12-15 05:18:39.628320] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:28:19.557 [2024-12-15 05:18:39.628327] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_md 00:28:19.557 [2024-12-15 05:18:39.628333] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 46.88 MiB 00:28:19.557 [2024-12-15 05:18:39.628340] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628346] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_md_mirror 00:28:19.557 [2024-12-15 05:18:39.628353] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.00 MiB 00:28:19.557 [2024-12-15 05:18:39.628359] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628366] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_log 00:28:19.557 [2024-12-15 05:18:39.628371] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628378] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628384] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_log_mirror 00:28:19.557 [2024-12-15 05:18:39.628391] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.25 MiB 00:28:19.557 [2024-12-15 05:18:39.628397] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:19.557 [2024-12-15 05:18:39.628403] ftl_layout.c: 775:ftl_layout_dump: 
*NOTICE*: [FTL][ftl] Base device layout: 00:28:19.557 [2024-12-15 05:18:39.628410] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region sb_mirror 00:28:19.558 [2024-12-15 05:18:39.628422] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.00 MiB 00:28:19.558 [2024-12-15 05:18:39.628428] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:19.558 [2024-12-15 05:18:39.628446] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region vmap 00:28:19.558 [2024-12-15 05:18:39.628452] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 18432.25 MiB 00:28:19.558 [2024-12-15 05:18:39.628460] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.88 MiB 00:28:19.558 [2024-12-15 05:18:39.628465] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region data_btm 00:28:19.558 [2024-12-15 05:18:39.628473] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.25 MiB 00:28:19.558 [2024-12-15 05:18:39.628479] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 18432.00 MiB 00:28:19.558 [2024-12-15 05:18:39.628488] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] SB metadata layout - nvc: 00:28:19.558 [2024-12-15 05:18:39.628498] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628507] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0xe80 00:28:19.558 [2024-12-15 05:18:39.628513] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x3 ver:2 blk_offs:0xea0 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628521] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x4 ver:2 blk_offs:0xec0 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628527] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xa ver:2 blk_offs:0xee0 blk_sz:0x800 00:28:19.558 [2024-12-15 05:18:39.628534] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xb ver:2 blk_offs:0x16e0 blk_sz:0x800 00:28:19.558 [2024-12-15 05:18:39.628540] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xc ver:2 blk_offs:0x1ee0 blk_sz:0x800 00:28:19.558 [2024-12-15 05:18:39.628549] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xd ver:2 blk_offs:0x26e0 blk_sz:0x800 00:28:19.558 [2024-12-15 05:18:39.628556] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xe ver:0 blk_offs:0x2ee0 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628563] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xf ver:0 blk_offs:0x2f00 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628570] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x10 ver:1 blk_offs:0x2f20 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628578] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x11 ver:1 blk_offs:0x2f40 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628584] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x6 ver:2 blk_offs:0x2f60 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628591] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region 
type:0x7 ver:2 blk_offs:0x2f80 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628597] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xfffffffe ver:0 blk_offs:0x2fa0 blk_sz:0x13d060 00:28:19.558 [2024-12-15 05:18:39.628604] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] SB metadata layout - base dev: 00:28:19.558 [2024-12-15 05:18:39.628611] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628622] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:28:19.558 [2024-12-15 05:18:39.628628] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x480000 00:28:19.558 [2024-12-15 05:18:39.628636] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x5 ver:0 blk_offs:0x480040 blk_sz:0xe0 00:28:19.558 [2024-12-15 05:18:39.628642] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xfffffffe ver:0 blk_offs:0x480120 blk_sz:0x7fee0 00:28:19.558 [2024-12-15 05:18:39.628649] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:19.558 [2024-12-15 05:18:39.628656] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Layout upgrade 00:28:19.558 [2024-12-15 05:18:39.628666] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.572 ms 00:28:19.558 [2024-12-15 05:18:39.628672] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:19.558 [2024-12-15 05:18:39.628703] mngt/ftl_mngt_misc.c: 165:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl] NV cache data region needs scrubbing, this may take a while. 
00:28:19.558 [2024-12-15 05:18:39.628713] mngt/ftl_mngt_misc.c: 166:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl] Scrubbing 5 chunks 00:28:22.861 [2024-12-15 05:18:42.824598] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.824661] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Scrub NV cache 00:28:22.861 [2024-12-15 05:18:42.824678] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 3195.879 ms 00:28:22.861 [2024-12-15 05:18:42.824686] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.832970] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.833015] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize metadata 00:28:22.861 [2024-12-15 05:18:42.833031] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 8.204 ms 00:28:22.861 [2024-12-15 05:18:42.833039] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.833095] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.833104] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize band addresses 00:28:22.861 [2024-12-15 05:18:42.833114] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.014 ms 00:28:22.861 [2024-12-15 05:18:42.833121] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.841752] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.841788] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize NV cache 00:28:22.861 [2024-12-15 05:18:42.841800] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 8.593 ms 00:28:22.861 [2024-12-15 05:18:42.841810] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.841837] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.841844] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize valid map 00:28:22.861 [2024-12-15 05:18:42.841854] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.002 ms 00:28:22.861 [2024-12-15 05:18:42.841861] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.842203] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.842218] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize trim map 00:28:22.861 [2024-12-15 05:18:42.842229] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.307 ms 00:28:22.861 [2024-12-15 05:18:42.842237] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.842280] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.842288] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize bands metadata 00:28:22.861 [2024-12-15 05:18:42.842298] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.019 ms 00:28:22.861 [2024-12-15 05:18:42.842305] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.847834] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.847871] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize reloc 00:28:22.861 [2024-12-15 05:18:42.847882] mngt/ftl_mngt.c: 
430:trace_step: *NOTICE*: [FTL][ftl] duration: 5.508 ms 00:28:22.861 [2024-12-15 05:18:42.847889] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.865832] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 1 (of 2) MiB 00:28:22.861 [2024-12-15 05:18:42.866834] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.866873] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize L2P 00:28:22.861 [2024-12-15 05:18:42.866888] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 18.866 ms 00:28:22.861 [2024-12-15 05:18:42.866900] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.880505] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.880554] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Clear L2P 00:28:22.861 [2024-12-15 05:18:42.880570] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 13.566 ms 00:28:22.861 [2024-12-15 05:18:42.880582] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.880664] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.880682] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Finalize band initialization 00:28:22.861 [2024-12-15 05:18:42.880690] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.045 ms 00:28:22.861 [2024-12-15 05:18:42.880699] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.883548] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.883585] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Save initial band info metadata 00:28:22.861 [2024-12-15 05:18:42.883598] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 2.832 ms 00:28:22.861 [2024-12-15 05:18:42.883607] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.887036] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.887075] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Save initial chunk info metadata 00:28:22.861 [2024-12-15 05:18:42.887084] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 3.395 ms 00:28:22.861 [2024-12-15 05:18:42.887093] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.887380] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.887397] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize P2L checkpointing 00:28:22.861 [2024-12-15 05:18:42.887405] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.256 ms 00:28:22.861 [2024-12-15 05:18:42.887416] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.918686] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.918727] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Wipe P2L region 00:28:22.861 [2024-12-15 05:18:42.918740] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 31.208 ms 00:28:22.861 [2024-12-15 05:18:42.918750] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.923547] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 
00:28:22.861 [2024-12-15 05:18:42.923587] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Clear trim map 00:28:22.861 [2024-12-15 05:18:42.923596] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 4.756 ms 00:28:22.861 [2024-12-15 05:18:42.923606] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.927614] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.927652] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Clear trim log 00:28:22.861 [2024-12-15 05:18:42.927661] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 3.976 ms 00:28:22.861 [2024-12-15 05:18:42.927670] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.931828] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.931865] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Set FTL dirty state 00:28:22.861 [2024-12-15 05:18:42.931874] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 4.126 ms 00:28:22.861 [2024-12-15 05:18:42.931885] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.931920] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.931931] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Start core poller 00:28:22.861 [2024-12-15 05:18:42.931944] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.005 ms 00:28:22.861 [2024-12-15 05:18:42.931952] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.932012] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:22.861 [2024-12-15 05:18:42.932024] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Finalize initialization 00:28:22.861 [2024-12-15 05:18:42.932032] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.032 ms 00:28:22.861 [2024-12-15 05:18:42.932043] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:22.861 [2024-12-15 05:18:42.933101] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'FTL startup', duration = 3312.569 ms, result 0 00:28:22.861 { 00:28:22.861 "name": "ftl", 00:28:22.861 "uuid": "b0da1c54-e331-4f00-a60a-a11723dff425" 00:28:22.861 } 00:28:22.861 05:18:42 ftl.ftl_upgrade_shutdown -- ftl/common.sh@121 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py nvmf_create_transport --trtype TCP 00:28:23.123 [2024-12-15 05:18:43.137594] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:28:23.123 05:18:43 ftl.ftl_upgrade_shutdown -- ftl/common.sh@122 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py nvmf_create_subsystem nqn.2018-09.io.spdk:cnode0 -a -m 1 00:28:23.384 05:18:43 ftl.ftl_upgrade_shutdown -- ftl/common.sh@123 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py nvmf_subsystem_add_ns nqn.2018-09.io.spdk:cnode0 ftl 00:28:23.645 [2024-12-15 05:18:43.566002] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl] FTL IO channel created on nvmf_tgt_poll_group_000 00:28:23.645 05:18:43 ftl.ftl_upgrade_shutdown -- ftl/common.sh@124 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py nvmf_subsystem_add_listener nqn.2018-09.io.spdk:cnode0 -t TCP -f ipv4 -s 4420 -a 127.0.0.1 00:28:23.645 [2024-12-15 05:18:43.770425] tcp.c:1099:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:28:23.906 05:18:43 
ftl.ftl_upgrade_shutdown -- ftl/common.sh@126 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py save_config 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@28 -- # size=1073741824 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@29 -- # seek=0 00:28:24.168 Fill FTL, iteration 1 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@30 -- # skip=0 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@31 -- # bs=1048576 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@32 -- # count=1024 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@33 -- # iterations=2 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@34 -- # qd=2 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@35 -- # sums=() 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@38 -- # (( i = 0 )) 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@38 -- # (( i < iterations )) 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@39 -- # echo 'Fill FTL, iteration 1' 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@40 -- # tcp_dd --if=/dev/urandom --ob=ftln1 --bs=1048576 --count=1024 --qd=2 --seek=0 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/common.sh@198 -- # tcp_initiator_setup 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/common.sh@151 -- # local 'rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock' 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/common.sh@153 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json ]] 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/common.sh@157 -- # [[ -z ftl ]] 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/common.sh@163 -- # spdk_ini_pid=95808 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/common.sh@164 -- # export spdk_ini_pid 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/common.sh@165 -- # waitforlisten 95808 /var/tmp/spdk.tgt.sock 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@835 -- # '[' -z 95808 ']' 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.tgt.sock 00:28:24.168 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.tgt.sock... 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@840 -- # local max_retries=100 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.tgt.sock...' 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@844 -- # xtrace_disable 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- ftl/common.sh@162 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt '--cpumask=[1]' --rpc-socket=/var/tmp/spdk.tgt.sock 00:28:24.168 05:18:44 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@10 -- # set +x 00:28:24.168 [2024-12-15 05:18:44.221735] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
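The xtrace above sets up the core workload: each iteration writes 1 GiB (bs=1048576 bytes x count=1024 blocks, so size=1073741824) into the FTL bdev ftln1 at queue depth 2, runs twice, and records an MD5 digest per gigabyte for verification after the upgrade/restart. A minimal sketch of the loop shape, reconstructed from the traced variables rather than the literal upgrade_shutdown.sh source (tcp_dd is the harness helper that wraps spdk_dd, as the traces that follow show):

    size=1073741824 bs=1048576 count=1024 qd=2 iterations=2
    seek=0 skip=0 sums=()
    file=/home/vagrant/spdk_repo/spdk/test/ftl/file
    for (( i = 0; i < iterations; i++ )); do
        echo "Fill FTL, iteration $(( i + 1 ))"
        tcp_dd --if=/dev/urandom --ob=ftln1 --bs="$bs" --count="$count" --qd="$qd" --seek="$seek"
        seek=$(( seek + count ))
        echo "Calculate MD5 checksum, iteration $(( i + 1 ))"
        tcp_dd --ib=ftln1 --of="$file" --bs="$bs" --count="$count" --qd="$qd" --skip="$skip"
        skip=$(( skip + count ))
        sums[i]=$(md5sum "$file" | cut -f1 -d' ')
    done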
00:28:24.168 [2024-12-15 05:18:44.221879] [ DPDK EAL parameters: spdk_tgt --no-shconf -l 1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid95808 ] 00:28:24.430 [2024-12-15 05:18:44.380573] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:24.430 [2024-12-15 05:18:44.408910] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:28:25.004 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:28:25.004 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@868 -- # return 0 00:28:25.004 05:18:45 ftl.ftl_upgrade_shutdown -- ftl/common.sh@167 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock bdev_nvme_attach_controller -b ftl -t tcp -a 127.0.0.1 -s 4420 -f ipv4 -n nqn.2018-09.io.spdk:cnode0 00:28:25.274 ftln1 00:28:25.274 05:18:45 ftl.ftl_upgrade_shutdown -- ftl/common.sh@171 -- # echo '{"subsystems": [' 00:28:25.274 05:18:45 ftl.ftl_upgrade_shutdown -- ftl/common.sh@172 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock save_subsystem_config -n bdev 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- ftl/common.sh@173 -- # echo ']}' 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- ftl/common.sh@176 -- # killprocess 95808 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@954 -- # '[' -z 95808 ']' 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@958 -- # kill -0 95808 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@959 -- # uname 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 95808 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@960 -- # process_name=reactor_1 00:28:25.536 killing process with pid 95808 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@964 -- # '[' reactor_1 = sudo ']' 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@972 -- # echo 'killing process with pid 95808' 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@973 -- # kill 95808 00:28:25.536 05:18:45 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@978 -- # wait 95808 00:28:25.797 05:18:45 ftl.ftl_upgrade_shutdown -- ftl/common.sh@177 -- # unset spdk_ini_pid 00:28:25.797 05:18:45 ftl.ftl_upgrade_shutdown -- ftl/common.sh@199 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd '--cpumask=[1]' --rpc-socket=/var/tmp/spdk.tgt.sock --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json --if=/dev/urandom --ob=ftln1 --bs=1048576 --count=1024 --qd=2 --seek=0 00:28:25.797 [2024-12-15 05:18:45.861332] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
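What tcp_initiator_setup did here, pieced together from the trace: a short-lived spdk_tgt (pid 95808, core 1, RPC socket /var/tmp/spdk.tgt.sock) connects back to the NVMe/TCP target as an initiator, which surfaces the FTL namespace as bdev ftln1; its bdev subsystem config is wrapped in a subsystems JSON object and the helper process is killed, so every later spdk_dd can recreate ftln1 straight from that file via --json, with no live RPC server needed. Condensed from the traced commands (the redirect into ini.json is inferred from the later --json=.../ini.json usage, not shown explicitly in the trace):

    rpc="/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock"
    $rpc bdev_nvme_attach_controller -b ftl -t tcp -a 127.0.0.1 -s 4420 \
        -f ipv4 -n nqn.2018-09.io.spdk:cnode0        # prints the new bdev name: ftln1
    { echo '{"subsystems": ['
      $rpc save_subsystem_config -n bdev
      echo ']}'; } > /home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json    # inferred target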
00:28:25.797 [2024-12-15 05:18:45.861479] [ DPDK EAL parameters: spdk_dd --no-shconf -l 1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid95840 ] 00:28:26.058 [2024-12-15 05:18:46.019098] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:26.058 [2024-12-15 05:18:46.037001] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:28:27.444  [2024-12-15T05:18:48.525Z] Copying: 192/1024 [MB] (192 MBps) [2024-12-15T05:18:49.468Z] Copying: 419/1024 [MB] (227 MBps) [2024-12-15T05:18:50.410Z] Copying: 674/1024 [MB] (255 MBps) [2024-12-15T05:18:50.671Z] Copying: 923/1024 [MB] (249 MBps) [2024-12-15T05:18:50.931Z] Copying: 1024/1024 [MB] (average 230 MBps) 00:28:30.791 00:28:30.791 Calculate MD5 checksum, iteration 1 00:28:30.791 05:18:50 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@41 -- # seek=1024 00:28:30.791 05:18:50 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@43 -- # echo 'Calculate MD5 checksum, iteration 1' 00:28:30.791 05:18:50 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@44 -- # tcp_dd --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=0 00:28:30.791 05:18:50 ftl.ftl_upgrade_shutdown -- ftl/common.sh@198 -- # tcp_initiator_setup 00:28:30.791 05:18:50 ftl.ftl_upgrade_shutdown -- ftl/common.sh@151 -- # local 'rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock' 00:28:30.791 05:18:50 ftl.ftl_upgrade_shutdown -- ftl/common.sh@153 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json ]] 00:28:30.791 05:18:50 ftl.ftl_upgrade_shutdown -- ftl/common.sh@154 -- # return 0 00:28:30.791 05:18:50 ftl.ftl_upgrade_shutdown -- ftl/common.sh@199 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd '--cpumask=[1]' --rpc-socket=/var/tmp/spdk.tgt.sock --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=0 00:28:30.791 [2024-12-15 05:18:50.862375] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
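Worth noting here: the fill pass averaged 230 MBps, and 1024 MiB at that rate predicts roughly 4.5 s of wall clock, which is about the span from the reactor start at 05:18:46 to the final 1024/1024 stamp at 05:18:50. The checksum readback that follows runs much faster (~664 MBps average), presumably because it avoids both the urandom source and FTL's write path. The prediction, as trivial arithmetic:

    echo "scale=2; 1024 / 230" | bc    # -> 4.45 (seconds), matching the ~4.5 s span above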
00:28:30.791 [2024-12-15 05:18:50.862498] [ DPDK EAL parameters: spdk_dd --no-shconf -l 1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid95898 ] 00:28:31.052 [2024-12-15 05:18:51.017353] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:31.052 [2024-12-15 05:18:51.033535] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:28:32.441  [2024-12-15T05:18:52.842Z] Copying: 680/1024 [MB] (680 MBps) [2024-12-15T05:18:53.104Z] Copying: 1024/1024 [MB] (average 664 MBps) 00:28:32.964 00:28:32.964 05:18:52 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@45 -- # skip=1024 00:28:32.964 05:18:52 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@47 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/file 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@48 -- # cut -f1 '-d ' 00:28:35.543 Fill FTL, iteration 2 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@48 -- # sums[i]=7277b8a181cfd328f6ae0145f52da3f0 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@38 -- # (( i++ )) 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@38 -- # (( i < iterations )) 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@39 -- # echo 'Fill FTL, iteration 2' 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@40 -- # tcp_dd --if=/dev/urandom --ob=ftln1 --bs=1048576 --count=1024 --qd=2 --seek=1024 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/common.sh@198 -- # tcp_initiator_setup 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/common.sh@151 -- # local 'rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock' 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/common.sh@153 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json ]] 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/common.sh@154 -- # return 0 00:28:35.543 05:18:55 ftl.ftl_upgrade_shutdown -- ftl/common.sh@199 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd '--cpumask=[1]' --rpc-socket=/var/tmp/spdk.tgt.sock --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json --if=/dev/urandom --ob=ftln1 --bs=1048576 --count=1024 --qd=2 --seek=1024 00:28:35.543 [2024-12-15 05:18:55.117060] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
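The digest-capture idiom traced just above and below: md5sum prints "<digest>  <path>", and cut -f1 -d' ' keeps only the digest field, which the script stores into the sums array for comparison after the restart. In one line, with the file path exactly as traced:

    sums[i]=$(md5sum /home/vagrant/spdk_repo/spdk/test/ftl/file | cut -f1 -d' ')
    # iteration 1 stores 7277b8a181cfd328f6ae0145f52da3f0, per the next trace line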
00:28:35.543 [2024-12-15 05:18:55.117175] [ DPDK EAL parameters: spdk_dd --no-shconf -l 1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid95944 ] 00:28:35.543 [2024-12-15 05:18:55.270372] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:35.543 [2024-12-15 05:18:55.287062] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:28:36.486  [2024-12-15T05:18:57.566Z] Copying: 255/1024 [MB] (255 MBps) [2024-12-15T05:18:58.510Z] Copying: 498/1024 [MB] (243 MBps) [2024-12-15T05:18:59.897Z] Copying: 743/1024 [MB] (245 MBps) [2024-12-15T05:18:59.897Z] Copying: 991/1024 [MB] (248 MBps) [2024-12-15T05:18:59.897Z] Copying: 1024/1024 [MB] (average 247 MBps) 00:28:39.757 00:28:39.757 Calculate MD5 checksum, iteration 2 00:28:39.757 05:18:59 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@41 -- # seek=2048 00:28:39.757 05:18:59 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@43 -- # echo 'Calculate MD5 checksum, iteration 2' 00:28:39.757 05:18:59 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@44 -- # tcp_dd --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=1024 00:28:39.757 05:18:59 ftl.ftl_upgrade_shutdown -- ftl/common.sh@198 -- # tcp_initiator_setup 00:28:39.757 05:18:59 ftl.ftl_upgrade_shutdown -- ftl/common.sh@151 -- # local 'rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock' 00:28:39.757 05:18:59 ftl.ftl_upgrade_shutdown -- ftl/common.sh@153 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json ]] 00:28:39.757 05:18:59 ftl.ftl_upgrade_shutdown -- ftl/common.sh@154 -- # return 0 00:28:39.757 05:18:59 ftl.ftl_upgrade_shutdown -- ftl/common.sh@199 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd '--cpumask=[1]' --rpc-socket=/var/tmp/spdk.tgt.sock --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=1024 00:28:39.757 [2024-12-15 05:18:59.807761] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
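Iteration 2 lands one gigabyte further in: --seek and --skip are counted in bs-sized blocks (1 MiB here), so --seek=1024 places the second fill immediately after the first, and after this pass both advance to 2048. The byte offset works out to exactly $size:

    echo $(( 1024 * 1048576 ))    # -> 1073741824 bytes, i.e. 1 GiB into ftln1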
00:28:39.757 [2024-12-15 05:18:59.807880] [ DPDK EAL parameters: spdk_dd --no-shconf -l 1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid95991 ] 00:28:40.018 [2024-12-15 05:18:59.961708] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:40.018 [2024-12-15 05:18:59.985637] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:28:41.404  [2024-12-15T05:19:02.117Z] Copying: 653/1024 [MB] (653 MBps) [2024-12-15T05:19:02.689Z] Copying: 1024/1024 [MB] (average 635 MBps) 00:28:42.549 00:28:42.549 05:19:02 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@45 -- # skip=2048 00:28:42.549 05:19:02 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@47 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/file 00:28:45.086 05:19:04 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@48 -- # cut -f1 '-d ' 00:28:45.086 05:19:04 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@48 -- # sums[i]=069a163c4aad509a297c87c187f56e27 00:28:45.086 05:19:04 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@38 -- # (( i++ )) 00:28:45.086 05:19:04 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@38 -- # (( i < iterations )) 00:28:45.086 05:19:04 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@52 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_set_property -b ftl -p verbose_mode -v true 00:28:45.086 [2024-12-15 05:19:04.955977] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:45.086 [2024-12-15 05:19:04.956024] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Decode property 00:28:45.086 [2024-12-15 05:19:04.956036] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.008 ms 00:28:45.086 [2024-12-15 05:19:04.956045] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:45.086 [2024-12-15 05:19:04.956064] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:45.086 [2024-12-15 05:19:04.956072] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Set property 00:28:45.086 [2024-12-15 05:19:04.956079] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.002 ms 00:28:45.086 [2024-12-15 05:19:04.956086] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:45.086 [2024-12-15 05:19:04.956101] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:45.086 [2024-12-15 05:19:04.956108] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Property setting cleanup 00:28:45.086 [2024-12-15 05:19:04.956117] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.001 ms 00:28:45.086 [2024-12-15 05:19:04.956123] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:45.086 [2024-12-15 05:19:04.956180] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'Set FTL property', duration = 0.191 ms, result 0 00:28:45.086 true 00:28:45.086 05:19:04 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@53 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_get_properties -b ftl 00:28:45.086 { 00:28:45.086 "name": "ftl", 00:28:45.086 "properties": [ 00:28:45.086 { 00:28:45.086 "name": "superblock_version", 00:28:45.086 "value": 5, 00:28:45.086 "read-only": true 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "name": "base_device", 00:28:45.086 "bands": [ 00:28:45.086 { 00:28:45.086 "id": 0, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 
00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 1, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 2, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 3, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 4, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 5, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 6, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 7, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 8, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 9, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 10, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.086 { 00:28:45.086 "id": 11, 00:28:45.086 "state": "FREE", 00:28:45.086 "validity": 0.0 00:28:45.086 }, 00:28:45.087 { 00:28:45.087 "id": 12, 00:28:45.087 "state": "FREE", 00:28:45.087 "validity": 0.0 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "id": 13, 00:28:45.087 "state": "FREE", 00:28:45.087 "validity": 0.0 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "id": 14, 00:28:45.087 "state": "FREE", 00:28:45.087 "validity": 0.0 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "id": 15, 00:28:45.087 "state": "FREE", 00:28:45.087 "validity": 0.0 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "id": 16, 00:28:45.087 "state": "FREE", 00:28:45.087 "validity": 0.0 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "id": 17, 00:28:45.087 "state": "FREE", 00:28:45.087 "validity": 0.0 00:28:45.087 } 00:28:45.087 ], 00:28:45.087 "read-only": true 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "name": "cache_device", 00:28:45.087 "type": "bdev", 00:28:45.087 "chunks": [ 00:28:45.087 { 00:28:45.087 "id": 0, 00:28:45.087 "state": "INACTIVE", 00:28:45.087 "utilization": 0.0 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "id": 1, 00:28:45.087 "state": "CLOSED", 00:28:45.087 "utilization": 1.0 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "id": 2, 00:28:45.087 "state": "CLOSED", 00:28:45.087 "utilization": 1.0 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "id": 3, 00:28:45.087 "state": "OPEN", 00:28:45.087 "utilization": 0.001953125 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "id": 4, 00:28:45.087 "state": "OPEN", 00:28:45.087 "utilization": 0.0 00:28:45.087 } 00:28:45.087 ], 00:28:45.087 "read-only": true 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "name": "verbose_mode", 00:28:45.087 "value": true, 00:28:45.087 "unit": "", 00:28:45.087 "desc": "In verbose mode, user is able to get access to additional advanced FTL properties" 00:28:45.087 }, 00:28:45.087 { 00:28:45.087 "name": "prep_upgrade_on_shutdown", 00:28:45.087 "value": false, 00:28:45.087 "unit": "", 00:28:45.087 "desc": "During shutdown, FTL executes all actions which are needed for upgrade to a new version" 00:28:45.087 } 00:28:45.087 ] 00:28:45.087 } 00:28:45.087 05:19:05 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@56 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_set_property -b ftl -p prep_upgrade_on_shutdown -v true 00:28:45.345 [2024-12-15 05:19:05.360275] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 
00:28:45.345 [2024-12-15 05:19:05.360304] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Decode property 00:28:45.345 [2024-12-15 05:19:05.360312] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.004 ms 00:28:45.345 [2024-12-15 05:19:05.360318] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:45.345 [2024-12-15 05:19:05.360335] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:45.345 [2024-12-15 05:19:05.360341] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Set property 00:28:45.345 [2024-12-15 05:19:05.360347] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.001 ms 00:28:45.345 [2024-12-15 05:19:05.360354] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:45.345 [2024-12-15 05:19:05.360368] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:45.345 [2024-12-15 05:19:05.360374] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Property setting cleanup 00:28:45.345 [2024-12-15 05:19:05.360381] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.001 ms 00:28:45.345 [2024-12-15 05:19:05.360386] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:45.345 [2024-12-15 05:19:05.360429] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'Set FTL property', duration = 0.148 ms, result 0 00:28:45.345 true 00:28:45.345 05:19:05 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@63 -- # ftl_get_properties 00:28:45.345 05:19:05 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@59 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_get_properties -b ftl 00:28:45.345 05:19:05 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@63 -- # jq '[.properties[] | select(.name == "cache_device") | .chunks[] | select(.utilization != 0.0)] | length' 00:28:45.602 05:19:05 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@63 -- # used=3 00:28:45.602 05:19:05 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@64 -- # [[ 3 -eq 0 ]] 00:28:45.602 05:19:05 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@70 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_set_property -b ftl -p verbose_mode -v true 00:28:45.860 [2024-12-15 05:19:05.776631] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:45.860 [2024-12-15 05:19:05.776664] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Decode property 00:28:45.860 [2024-12-15 05:19:05.776673] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.003 ms 00:28:45.860 [2024-12-15 05:19:05.776679] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:45.860 [2024-12-15 05:19:05.776696] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:45.860 [2024-12-15 05:19:05.776703] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Set property 00:28:45.860 [2024-12-15 05:19:05.776709] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.001 ms 00:28:45.860 [2024-12-15 05:19:05.776715] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:45.860 [2024-12-15 05:19:05.776729] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:45.860 [2024-12-15 05:19:05.776736] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Property setting cleanup 00:28:45.860 [2024-12-15 05:19:05.776743] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.001 ms 00:28:45.860 [2024-12-15 05:19:05.776748] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: 
[FTL][ftl] status: 0 00:28:45.860 [2024-12-15 05:19:05.776789] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'Set FTL property', duration = 0.146 ms, result 0 00:28:45.860 true 00:28:45.860 05:19:05 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@71 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_get_properties -b ftl 00:28:45.860 { 00:28:45.860 "name": "ftl", 00:28:45.860 "properties": [ 00:28:45.860 { 00:28:45.860 "name": "superblock_version", 00:28:45.860 "value": 5, 00:28:45.860 "read-only": true 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "name": "base_device", 00:28:45.860 "bands": [ 00:28:45.860 { 00:28:45.860 "id": 0, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 1, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 2, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 3, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 4, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 5, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 6, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 7, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 8, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 9, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 10, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 11, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 12, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 13, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.860 { 00:28:45.860 "id": 14, 00:28:45.860 "state": "FREE", 00:28:45.860 "validity": 0.0 00:28:45.860 }, 00:28:45.861 { 00:28:45.861 "id": 15, 00:28:45.861 "state": "FREE", 00:28:45.861 "validity": 0.0 00:28:45.861 }, 00:28:45.861 { 00:28:45.861 "id": 16, 00:28:45.861 "state": "FREE", 00:28:45.861 "validity": 0.0 00:28:45.861 }, 00:28:45.861 { 00:28:45.861 "id": 17, 00:28:45.861 "state": "FREE", 00:28:45.861 "validity": 0.0 00:28:45.861 } 00:28:45.861 ], 00:28:45.861 "read-only": true 00:28:45.861 }, 00:28:45.861 { 00:28:45.861 "name": "cache_device", 00:28:45.861 "type": "bdev", 00:28:45.861 "chunks": [ 00:28:45.861 { 00:28:45.861 "id": 0, 00:28:45.861 "state": "INACTIVE", 00:28:45.861 "utilization": 0.0 00:28:45.861 }, 00:28:45.861 { 00:28:45.861 "id": 1, 00:28:45.861 "state": "CLOSED", 00:28:45.861 "utilization": 1.0 00:28:45.861 }, 00:28:45.861 { 00:28:45.861 "id": 2, 00:28:45.861 "state": "CLOSED", 00:28:45.861 "utilization": 1.0 00:28:45.861 }, 00:28:45.861 { 00:28:45.861 "id": 3, 00:28:45.861 "state": "OPEN", 00:28:45.861 "utilization": 0.001953125 00:28:45.861 }, 00:28:45.861 { 00:28:45.861 "id": 4, 00:28:45.861 "state": "OPEN", 00:28:45.861 "utilization": 0.0 00:28:45.861 } 00:28:45.861 ], 00:28:45.861 "read-only": true 00:28:45.861 }, 00:28:45.861 { 00:28:45.861 "name": "verbose_mode", 
00:28:45.861 "value": true, 00:28:45.861 "unit": "", 00:28:45.861 "desc": "In verbose mode, user is able to get access to additional advanced FTL properties" 00:28:45.861 }, 00:28:45.861 { 00:28:45.861 "name": "prep_upgrade_on_shutdown", 00:28:45.861 "value": true, 00:28:45.861 "unit": "", 00:28:45.861 "desc": "During shutdown, FTL executes all actions which are needed for upgrade to a new version" 00:28:45.861 } 00:28:45.861 ] 00:28:45.861 } 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@74 -- # tcp_target_shutdown 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- ftl/common.sh@130 -- # [[ -n 95690 ]] 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- ftl/common.sh@131 -- # killprocess 95690 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@954 -- # '[' -z 95690 ']' 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@958 -- # kill -0 95690 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@959 -- # uname 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 95690 00:28:46.119 killing process with pid 95690 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@972 -- # echo 'killing process with pid 95690' 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@973 -- # kill 95690 00:28:46.119 05:19:06 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@978 -- # wait 95690 00:28:46.119 [2024-12-15 05:19:06.140641] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl] FTL IO channel destroy on nvmf_tgt_poll_group_000 00:28:46.119 [2024-12-15 05:19:06.144761] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:46.119 [2024-12-15 05:19:06.144795] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Deinit core IO channel 00:28:46.119 [2024-12-15 05:19:06.144807] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.003 ms 00:28:46.119 [2024-12-15 05:19:06.144813] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:46.119 [2024-12-15 05:19:06.144836] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl] FTL IO channel destroy on app_thread 00:28:46.119 [2024-12-15 05:19:06.145347] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:46.119 [2024-12-15 05:19:06.145371] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Unregister IO device 00:28:46.119 [2024-12-15 05:19:06.145379] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.499 ms 00:28:46.119 [2024-12-15 05:19:06.145385] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.753464] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.115 [2024-12-15 05:19:14.753716] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Stop core poller 00:28:56.115 [2024-12-15 05:19:14.753735] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 8608.025 ms 00:28:56.115 [2024-12-15 05:19:14.753743] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.754896] mngt/ftl_mngt.c: 427:trace_step: 
*NOTICE*: [FTL][ftl] Action 00:28:56.115 [2024-12-15 05:19:14.754913] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist L2P 00:28:56.115 [2024-12-15 05:19:14.754921] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.137 ms 00:28:56.115 [2024-12-15 05:19:14.754928] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.755894] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.115 [2024-12-15 05:19:14.755923] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Finish L2P trims 00:28:56.115 [2024-12-15 05:19:14.755937] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.841 ms 00:28:56.115 [2024-12-15 05:19:14.755943] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.758619] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.115 [2024-12-15 05:19:14.758649] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist NV cache metadata 00:28:56.115 [2024-12-15 05:19:14.758658] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 2.635 ms 00:28:56.115 [2024-12-15 05:19:14.758665] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.761780] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.115 [2024-12-15 05:19:14.761811] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist valid map metadata 00:28:56.115 [2024-12-15 05:19:14.761820] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 3.086 ms 00:28:56.115 [2024-12-15 05:19:14.761831] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.761891] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.115 [2024-12-15 05:19:14.761899] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist P2L metadata 00:28:56.115 [2024-12-15 05:19:14.761915] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.032 ms 00:28:56.115 [2024-12-15 05:19:14.761921] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.764184] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.115 [2024-12-15 05:19:14.764225] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist band info metadata 00:28:56.115 [2024-12-15 05:19:14.764233] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 2.250 ms 00:28:56.115 [2024-12-15 05:19:14.764239] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.766259] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.115 [2024-12-15 05:19:14.766287] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist trim metadata 00:28:56.115 [2024-12-15 05:19:14.766295] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.994 ms 00:28:56.115 [2024-12-15 05:19:14.766301] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.768204] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.115 [2024-12-15 05:19:14.768355] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist superblock 00:28:56.115 [2024-12-15 05:19:14.768367] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.878 ms 00:28:56.115 [2024-12-15 05:19:14.768374] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.769655] 
mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.115 [2024-12-15 05:19:14.769677] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Set FTL clean state 00:28:56.115 [2024-12-15 05:19:14.769684] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.230 ms 00:28:56.115 [2024-12-15 05:19:14.769690] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.115 [2024-12-15 05:19:14.769715] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Bands validity: 00:28:56.116 [2024-12-15 05:19:14.769727] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 1: 261120 / 261120 wr_cnt: 1 state: closed 00:28:56.116 [2024-12-15 05:19:14.769737] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 2: 261120 / 261120 wr_cnt: 1 state: closed 00:28:56.116 [2024-12-15 05:19:14.769743] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 3: 2048 / 261120 wr_cnt: 1 state: closed 00:28:56.116 [2024-12-15 05:19:14.769750] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769757] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769763] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769769] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769775] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769781] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769787] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769794] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769801] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769806] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769813] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769818] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769824] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769829] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769835] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:28:56.116 [2024-12-15 05:19:14.769843] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] 00:28:56.116 [2024-12-15 05:19:14.769848] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] device UUID: b0da1c54-e331-4f00-a60a-a11723dff425 00:28:56.116 [2024-12-15 05:19:14.769855] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] total valid LBAs: 524288 00:28:56.116 [2024-12-15 05:19:14.769865] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: 
[FTL][ftl] total writes: 786752 00:28:56.116 [2024-12-15 05:19:14.769872] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] user writes: 524288 00:28:56.116 [2024-12-15 05:19:14.769878] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] WAF: 1.5006 00:28:56.116 [2024-12-15 05:19:14.769884] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] limits: 00:28:56.116 [2024-12-15 05:19:14.769891] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] crit: 0 00:28:56.116 [2024-12-15 05:19:14.769897] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] high: 0 00:28:56.116 [2024-12-15 05:19:14.769905] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] low: 0 00:28:56.116 [2024-12-15 05:19:14.769911] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] start: 0 00:28:56.116 [2024-12-15 05:19:14.769917] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.116 [2024-12-15 05:19:14.769923] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Dump statistics 00:28:56.116 [2024-12-15 05:19:14.769931] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.203 ms 00:28:56.116 [2024-12-15 05:19:14.769937] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.771760] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.116 [2024-12-15 05:19:14.771791] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Deinitialize L2P 00:28:56.116 [2024-12-15 05:19:14.771798] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.811 ms 00:28:56.116 [2024-12-15 05:19:14.771805] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.771894] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:56.116 [2024-12-15 05:19:14.771901] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Deinitialize P2L checkpointing 00:28:56.116 [2024-12-15 05:19:14.771908] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.073 ms 00:28:56.116 [2024-12-15 05:19:14.771915] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.777887] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.777920] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize reloc 00:28:56.116 [2024-12-15 05:19:14.777930] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.777936] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.777965] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.777973] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize bands metadata 00:28:56.116 [2024-12-15 05:19:14.777979] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.777986] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.778052] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.778062] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize trim map 00:28:56.116 [2024-12-15 05:19:14.778073] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.778080] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.778095] mngt/ftl_mngt.c: 
427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.778104] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize valid map 00:28:56.116 [2024-12-15 05:19:14.778111] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.778117] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.789604] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.789773] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize NV cache 00:28:56.116 [2024-12-15 05:19:14.789787] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.789794] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.798385] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.798423] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize metadata 00:28:56.116 [2024-12-15 05:19:14.798490] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.798498] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.798566] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.798580] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize core IO channel 00:28:56.116 [2024-12-15 05:19:14.798589] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.798595] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.798623] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.798631] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize bands 00:28:56.116 [2024-12-15 05:19:14.798637] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.798649] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.798709] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.798717] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize memory pools 00:28:56.116 [2024-12-15 05:19:14.798726] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.798733] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.798758] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.798769] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize superblock 00:28:56.116 [2024-12-15 05:19:14.798776] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.798785] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.798822] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.798829] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Open cache bdev 00:28:56.116 [2024-12-15 05:19:14.798839] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.798846] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 
[2024-12-15 05:19:14.798886] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:28:56.116 [2024-12-15 05:19:14.798895] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Open base bdev 00:28:56.116 [2024-12-15 05:19:14.798901] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:28:56.116 [2024-12-15 05:19:14.798908] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:56.116 [2024-12-15 05:19:14.799039] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'FTL shutdown', duration = 8654.221 ms, result 0 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- ftl/common.sh@132 -- # unset spdk_tgt_pid 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@75 -- # tcp_target_setup 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- ftl/common.sh@81 -- # local base_bdev= 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- ftl/common.sh@82 -- # local cache_bdev= 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- ftl/common.sh@84 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json ]] 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- ftl/common.sh@89 -- # spdk_tgt_pid=96182 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- ftl/common.sh@85 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt '--cpumask=[0]' --config=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- ftl/common.sh@90 -- # export spdk_tgt_pid 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- ftl/common.sh@91 -- # waitforlisten 96182 00:28:58.665 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@835 -- # '[' -z 96182 ']' 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@840 -- # local max_retries=100 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@844 -- # xtrace_disable 00:28:58.665 05:19:18 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@10 -- # set +x 00:28:58.926 [2024-12-15 05:19:18.812506] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
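Two things stand out in the shutdown block above before the restart that follows. First, with prep_upgrade_on_shutdown set, the 'FTL shutdown' process persisted the full device state (L2P, NV cache, valid map, P2L, band and trim metadata, then the superblock) and set the clean state in 8654 ms, almost all of it the 8608 ms 'Stop core poller' step, presumably waiting out in-flight work. Second, the statistics dump lets the reported write amplification be checked by hand:

    # WAF = total writes / user writes, from the stats dump above:
    echo "scale=4; 786752 / 524288" | bc    # -> 1.5006, the reported WAF

The target is then relaunched (pid 96182, core 0) from the saved tgt.json; the 'unable to find bdev ... cachen1' notices just below look like benign retries while the cache device is still being set up.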
00:28:58.926 [2024-12-15 05:19:18.813194] [ DPDK EAL parameters: spdk_tgt --no-shconf -l 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96182 ] 00:28:58.926 [2024-12-15 05:19:18.975746] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:58.926 [2024-12-15 05:19:19.005600] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:28:59.501 [2024-12-15 05:19:19.331781] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: cachen1 00:28:59.501 [2024-12-15 05:19:19.331869] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: cachen1 00:28:59.501 [2024-12-15 05:19:19.485284] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.485535] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Check configuration 00:28:59.501 [2024-12-15 05:19:19.485569] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.005 ms 00:28:59.501 [2024-12-15 05:19:19.485578] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.485663] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.485675] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Open base bdev 00:28:59.501 [2024-12-15 05:19:19.485687] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.052 ms 00:28:59.501 [2024-12-15 05:19:19.485695] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.485725] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl] Using cachen1p0 as write buffer cache 00:28:59.501 [2024-12-15 05:19:19.486001] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl] Using bdev as NV Cache device 00:28:59.501 [2024-12-15 05:19:19.486018] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.486027] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Open cache bdev 00:28:59.501 [2024-12-15 05:19:19.486036] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.304 ms 00:28:59.501 [2024-12-15 05:19:19.486043] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.487797] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl] SHM: clean 0, shm_clean 0 00:28:59.501 [2024-12-15 05:19:19.491960] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.492024] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Load super block 00:28:59.501 [2024-12-15 05:19:19.492036] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 4.164 ms 00:28:59.501 [2024-12-15 05:19:19.492044] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.492132] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.492142] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Validate super block 00:28:59.501 [2024-12-15 05:19:19.492152] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.029 ms 00:28:59.501 [2024-12-15 05:19:19.492159] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.500775] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 
05:19:19.500820] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize memory pools 00:28:59.501 [2024-12-15 05:19:19.500831] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 8.566 ms 00:28:59.501 [2024-12-15 05:19:19.500845] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.500903] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.500913] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize bands 00:28:59.501 [2024-12-15 05:19:19.500921] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.021 ms 00:28:59.501 [2024-12-15 05:19:19.500929] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.500996] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.501007] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Register IO device 00:28:59.501 [2024-12-15 05:19:19.501016] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.007 ms 00:28:59.501 [2024-12-15 05:19:19.501026] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.501050] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl] FTL IO channel created on app_thread 00:28:59.501 [2024-12-15 05:19:19.503130] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.503170] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize core IO channel 00:28:59.501 [2024-12-15 05:19:19.503181] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 2.085 ms 00:28:59.501 [2024-12-15 05:19:19.503189] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.503226] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.503234] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Decorate bands 00:28:59.501 [2024-12-15 05:19:19.503243] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.006 ms 00:28:59.501 [2024-12-15 05:19:19.503250] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.503275] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl] FTL layout setup mode 0 00:28:59.501 [2024-12-15 05:19:19.503298] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl] nvc layout blob load 0x150 bytes 00:28:59.501 [2024-12-15 05:19:19.503334] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl] base layout blob load 0x48 bytes 00:28:59.501 [2024-12-15 05:19:19.503360] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl] layout blob load 0x190 bytes 00:28:59.501 [2024-12-15 05:19:19.503494] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl] nvc layout blob store 0x150 bytes 00:28:59.501 [2024-12-15 05:19:19.503507] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl] base layout blob store 0x48 bytes 00:28:59.501 [2024-12-15 05:19:19.503519] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl] layout blob store 0x190 bytes 00:28:59.501 [2024-12-15 05:19:19.503529] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl] Base device capacity: 20480.00 MiB 00:28:59.501 [2024-12-15 05:19:19.503538] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl] NV cache device 
capacity: 5120.00 MiB 00:28:59.501 [2024-12-15 05:19:19.503547] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl] L2P entries: 3774873 00:28:59.501 [2024-12-15 05:19:19.503554] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl] L2P address size: 4 00:28:59.501 [2024-12-15 05:19:19.503561] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl] P2L checkpoint pages: 2048 00:28:59.501 [2024-12-15 05:19:19.503569] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl] NV cache chunk count 5 00:28:59.501 [2024-12-15 05:19:19.503599] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.503610] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize layout 00:28:59.501 [2024-12-15 05:19:19.503618] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.327 ms 00:28:59.501 [2024-12-15 05:19:19.503626] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.503714] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.501 [2024-12-15 05:19:19.503723] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Verify layout 00:28:59.501 [2024-12-15 05:19:19.503731] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.070 ms 00:28:59.501 [2024-12-15 05:19:19.503741] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.501 [2024-12-15 05:19:19.503847] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl] NV cache layout: 00:28:59.501 [2024-12-15 05:19:19.503859] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region sb 00:28:59.501 [2024-12-15 05:19:19.503874] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.00 MiB 00:28:59.501 [2024-12-15 05:19:19.503884] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:59.501 [2024-12-15 05:19:19.503893] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region l2p 00:28:59.501 [2024-12-15 05:19:19.503901] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.12 MiB 00:28:59.501 [2024-12-15 05:19:19.503909] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 14.50 MiB 00:28:59.501 [2024-12-15 05:19:19.503916] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region band_md 00:28:59.501 [2024-12-15 05:19:19.503925] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 14.62 MiB 00:28:59.501 [2024-12-15 05:19:19.503932] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:59.501 [2024-12-15 05:19:19.503940] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region band_md_mirror 00:28:59.501 [2024-12-15 05:19:19.503947] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 14.75 MiB 00:28:59.501 [2024-12-15 05:19:19.503955] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:59.501 [2024-12-15 05:19:19.503969] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region nvc_md 00:28:59.501 [2024-12-15 05:19:19.503977] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.38 MiB 00:28:59.502 [2024-12-15 05:19:19.503988] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:59.502 [2024-12-15 05:19:19.503997] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region nvc_md_mirror 00:28:59.502 [2024-12-15 05:19:19.504005] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.50 MiB 00:28:59.502 [2024-12-15 05:19:19.504013] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:59.502 [2024-12-15 05:19:19.504023] 
ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l0 00:28:59.502 [2024-12-15 05:19:19.504031] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 14.88 MiB 00:28:59.502 [2024-12-15 05:19:19.504039] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:28:59.502 [2024-12-15 05:19:19.504046] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l1 00:28:59.502 [2024-12-15 05:19:19.504054] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 22.88 MiB 00:28:59.502 [2024-12-15 05:19:19.504061] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:28:59.502 [2024-12-15 05:19:19.504069] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l2 00:28:59.502 [2024-12-15 05:19:19.504078] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 30.88 MiB 00:28:59.502 [2024-12-15 05:19:19.504086] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:28:59.502 [2024-12-15 05:19:19.504094] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l3 00:28:59.502 [2024-12-15 05:19:19.504101] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 38.88 MiB 00:28:59.502 [2024-12-15 05:19:19.504109] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:28:59.502 [2024-12-15 05:19:19.504122] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_md 00:28:59.502 [2024-12-15 05:19:19.504131] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 46.88 MiB 00:28:59.502 [2024-12-15 05:19:19.504139] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:59.502 [2024-12-15 05:19:19.504146] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_md_mirror 00:28:59.502 [2024-12-15 05:19:19.504154] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.00 MiB 00:28:59.502 [2024-12-15 05:19:19.504161] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:59.502 [2024-12-15 05:19:19.504169] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_log 00:28:59.502 [2024-12-15 05:19:19.504177] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.12 MiB 00:28:59.502 [2024-12-15 05:19:19.504185] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:59.502 [2024-12-15 05:19:19.504193] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_log_mirror 00:28:59.502 [2024-12-15 05:19:19.504200] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.25 MiB 00:28:59.502 [2024-12-15 05:19:19.504208] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:59.502 [2024-12-15 05:19:19.504227] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl] Base device layout: 00:28:59.502 [2024-12-15 05:19:19.504236] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region sb_mirror 00:28:59.502 [2024-12-15 05:19:19.504243] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.00 MiB 00:28:59.502 [2024-12-15 05:19:19.504255] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:28:59.502 [2024-12-15 05:19:19.504266] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region vmap 00:28:59.502 [2024-12-15 05:19:19.504273] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 18432.25 MiB 00:28:59.502 [2024-12-15 05:19:19.504279] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.88 MiB 00:28:59.502 [2024-12-15 05:19:19.504286] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region data_btm 00:28:59.502 [2024-12-15 05:19:19.504296] 
ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.25 MiB 00:28:59.502 [2024-12-15 05:19:19.504303] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 18432.00 MiB 00:28:59.502 [2024-12-15 05:19:19.504313] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] SB metadata layout - nvc: 00:28:59.502 [2024-12-15 05:19:19.504323] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504336] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0xe80 00:28:59.502 [2024-12-15 05:19:19.504344] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x3 ver:2 blk_offs:0xea0 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504352] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x4 ver:2 blk_offs:0xec0 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504360] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xa ver:2 blk_offs:0xee0 blk_sz:0x800 00:28:59.502 [2024-12-15 05:19:19.504369] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xb ver:2 blk_offs:0x16e0 blk_sz:0x800 00:28:59.502 [2024-12-15 05:19:19.504376] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xc ver:2 blk_offs:0x1ee0 blk_sz:0x800 00:28:59.502 [2024-12-15 05:19:19.504384] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xd ver:2 blk_offs:0x26e0 blk_sz:0x800 00:28:59.502 [2024-12-15 05:19:19.504392] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xe ver:0 blk_offs:0x2ee0 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504403] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xf ver:0 blk_offs:0x2f00 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504411] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x10 ver:1 blk_offs:0x2f20 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504418] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x11 ver:1 blk_offs:0x2f40 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504425] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x6 ver:2 blk_offs:0x2f60 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504433] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x7 ver:2 blk_offs:0x2f80 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504455] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xfffffffe ver:0 blk_offs:0x2fa0 blk_sz:0x13d060 00:28:59.502 [2024-12-15 05:19:19.504462] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] SB metadata layout - base dev: 00:28:59.502 [2024-12-15 05:19:19.504471] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504482] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:28:59.502 [2024-12-15 05:19:19.504490] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region 
type:0x9 ver:0 blk_offs:0x40 blk_sz:0x480000 00:28:59.502 [2024-12-15 05:19:19.504497] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x5 ver:0 blk_offs:0x480040 blk_sz:0xe0 00:28:59.502 [2024-12-15 05:19:19.504504] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xfffffffe ver:0 blk_offs:0x480120 blk_sz:0x7fee0 00:28:59.502 [2024-12-15 05:19:19.504512] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:28:59.502 [2024-12-15 05:19:19.504520] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Layout upgrade 00:28:59.502 [2024-12-15 05:19:19.504527] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.734 ms 00:28:59.502 [2024-12-15 05:19:19.504535] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:28:59.502 [2024-12-15 05:19:19.504579] mngt/ftl_mngt_misc.c: 165:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl] NV cache data region needs scrubbing, this may take a while. 00:28:59.502 [2024-12-15 05:19:19.504589] mngt/ftl_mngt_misc.c: 166:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl] Scrubbing 5 chunks 00:29:03.781 [2024-12-15 05:19:23.431194] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.431286] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Scrub NV cache 00:29:03.781 [2024-12-15 05:19:23.431308] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 3926.602 ms 00:29:03.781 [2024-12-15 05:19:23.431317] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.444960] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.445193] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize metadata 00:29:03.781 [2024-12-15 05:19:23.445216] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 13.526 ms 00:29:03.781 [2024-12-15 05:19:23.445225] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.445306] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.445319] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize band addresses 00:29:03.781 [2024-12-15 05:19:23.445328] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.017 ms 00:29:03.781 [2024-12-15 05:19:23.445344] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.458290] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.458342] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize NV cache 00:29:03.781 [2024-12-15 05:19:23.458355] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 12.894 ms 00:29:03.781 [2024-12-15 05:19:23.458364] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.458404] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.458414] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize valid map 00:29:03.781 [2024-12-15 05:19:23.458428] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.004 ms 00:29:03.781 [2024-12-15 05:19:23.458462] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.459038] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.459082] mngt/ftl_mngt.c: 
428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize trim map 00:29:03.781 [2024-12-15 05:19:23.459095] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.521 ms 00:29:03.781 [2024-12-15 05:19:23.459104] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.459168] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.459179] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize bands metadata 00:29:03.781 [2024-12-15 05:19:23.459189] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.032 ms 00:29:03.781 [2024-12-15 05:19:23.459203] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.467838] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.467880] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize reloc 00:29:03.781 [2024-12-15 05:19:23.467891] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 8.613 ms 00:29:03.781 [2024-12-15 05:19:23.467900] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.476876] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl] FTL NV Cache: full chunks = 0, empty chunks = 4 00:29:03.781 [2024-12-15 05:19:23.477086] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl] FTL NV Cache: state loaded successfully 00:29:03.781 [2024-12-15 05:19:23.477108] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.477118] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Restore NV cache metadata 00:29:03.781 [2024-12-15 05:19:23.477129] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 9.108 ms 00:29:03.781 [2024-12-15 05:19:23.477137] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.482248] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.482298] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Restore valid map metadata 00:29:03.781 [2024-12-15 05:19:23.482320] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 4.976 ms 00:29:03.781 [2024-12-15 05:19:23.482328] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.484905] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.484950] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Restore band info metadata 00:29:03.781 [2024-12-15 05:19:23.484960] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 2.521 ms 00:29:03.781 [2024-12-15 05:19:23.484968] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.487387] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.487447] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Restore trim metadata 00:29:03.781 [2024-12-15 05:19:23.487458] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 2.372 ms 00:29:03.781 [2024-12-15 05:19:23.487465] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.487838] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.487860] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize P2L checkpointing 00:29:03.781 [2024-12-15 
05:19:23.487871] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.277 ms 00:29:03.781 [2024-12-15 05:19:23.487883] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.513768] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.513829] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Restore P2L checkpoints 00:29:03.781 [2024-12-15 05:19:23.513842] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 25.864 ms 00:29:03.781 [2024-12-15 05:19:23.513851] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.781 [2024-12-15 05:19:23.522320] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 1 (of 2) MiB 00:29:03.781 [2024-12-15 05:19:23.523362] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.781 [2024-12-15 05:19:23.523405] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize L2P 00:29:03.782 [2024-12-15 05:19:23.523417] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 9.458 ms 00:29:03.782 [2024-12-15 05:19:23.523425] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.782 [2024-12-15 05:19:23.523524] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.782 [2024-12-15 05:19:23.523538] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Restore L2P 00:29:03.782 [2024-12-15 05:19:23.523553] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.015 ms 00:29:03.782 [2024-12-15 05:19:23.523562] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.782 [2024-12-15 05:19:23.523612] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.782 [2024-12-15 05:19:23.523626] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Finalize band initialization 00:29:03.782 [2024-12-15 05:19:23.523635] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.021 ms 00:29:03.782 [2024-12-15 05:19:23.523643] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.782 [2024-12-15 05:19:23.523667] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.782 [2024-12-15 05:19:23.523677] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Start core poller 00:29:03.782 [2024-12-15 05:19:23.523685] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.006 ms 00:29:03.782 [2024-12-15 05:19:23.523701] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.782 [2024-12-15 05:19:23.523739] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl] Self test skipped 00:29:03.782 [2024-12-15 05:19:23.523750] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.782 [2024-12-15 05:19:23.523759] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Self test on startup 00:29:03.782 [2024-12-15 05:19:23.523771] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.012 ms 00:29:03.782 [2024-12-15 05:19:23.523779] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.782 [2024-12-15 05:19:23.529394] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.782 [2024-12-15 05:19:23.529469] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Set FTL dirty state 00:29:03.782 [2024-12-15 05:19:23.529482] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 5.596 ms 00:29:03.782 [2024-12-15 05:19:23.529491] mngt/ftl_mngt.c: 431:trace_step: 
*NOTICE*: [FTL][ftl] status: 0 00:29:03.782 [2024-12-15 05:19:23.529573] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.782 [2024-12-15 05:19:23.529584] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Finalize initialization 00:29:03.782 [2024-12-15 05:19:23.529594] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.036 ms 00:29:03.782 [2024-12-15 05:19:23.529609] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.782 [2024-12-15 05:19:23.530861] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'FTL startup', duration = 4045.104 ms, result 0 00:29:03.782 [2024-12-15 05:19:23.543956] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:29:03.782 [2024-12-15 05:19:23.559941] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl] FTL IO channel created on nvmf_tgt_poll_group_000 00:29:03.782 [2024-12-15 05:19:23.568072] tcp.c:1099:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:29:03.782 05:19:23 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:29:03.782 05:19:23 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@868 -- # return 0 00:29:03.782 05:19:23 ftl.ftl_upgrade_shutdown -- ftl/common.sh@93 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json ]] 00:29:03.782 05:19:23 ftl.ftl_upgrade_shutdown -- ftl/common.sh@95 -- # return 0 00:29:03.782 05:19:23 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@78 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_set_property -b ftl -p verbose_mode -v true 00:29:03.782 [2024-12-15 05:19:23.808045] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.782 [2024-12-15 05:19:23.808158] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Decode property 00:29:03.782 [2024-12-15 05:19:23.808172] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.005 ms 00:29:03.782 [2024-12-15 05:19:23.808179] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.782 [2024-12-15 05:19:23.808200] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.782 [2024-12-15 05:19:23.808225] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Set property 00:29:03.782 [2024-12-15 05:19:23.808232] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.002 ms 00:29:03.782 [2024-12-15 05:19:23.808237] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.782 [2024-12-15 05:19:23.808252] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:03.782 [2024-12-15 05:19:23.808259] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Property setting cleanup 00:29:03.782 [2024-12-15 05:19:23.808268] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.001 ms 00:29:03.782 [2024-12-15 05:19:23.808274] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:03.782 [2024-12-15 05:19:23.808319] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'Set FTL property', duration = 0.263 ms, result 0 00:29:03.782 true 00:29:03.782 05:19:23 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@79 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_get_properties -b ftl 00:29:04.043 { 00:29:04.043 "name": "ftl", 00:29:04.043 "properties": [ 00:29:04.043 { 00:29:04.043 "name": "superblock_version", 00:29:04.043 "value": 5, 00:29:04.043 "read-only": true 00:29:04.043 }, 00:29:04.043 { 
00:29:04.043 "name": "base_device", 00:29:04.043 "bands": [ 00:29:04.043 { 00:29:04.043 "id": 0, 00:29:04.043 "state": "CLOSED", 00:29:04.043 "validity": 1.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 1, 00:29:04.043 "state": "CLOSED", 00:29:04.043 "validity": 1.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 2, 00:29:04.043 "state": "CLOSED", 00:29:04.043 "validity": 0.007843137254901933 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 3, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 4, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 5, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 6, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 7, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 8, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 9, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 10, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 11, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 12, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 13, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 14, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 15, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 16, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 17, 00:29:04.043 "state": "FREE", 00:29:04.043 "validity": 0.0 00:29:04.043 } 00:29:04.043 ], 00:29:04.043 "read-only": true 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "name": "cache_device", 00:29:04.043 "type": "bdev", 00:29:04.043 "chunks": [ 00:29:04.043 { 00:29:04.043 "id": 0, 00:29:04.043 "state": "INACTIVE", 00:29:04.043 "utilization": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 1, 00:29:04.043 "state": "OPEN", 00:29:04.043 "utilization": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 2, 00:29:04.043 "state": "OPEN", 00:29:04.043 "utilization": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 3, 00:29:04.043 "state": "FREE", 00:29:04.043 "utilization": 0.0 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "id": 4, 00:29:04.043 "state": "FREE", 00:29:04.043 "utilization": 0.0 00:29:04.043 } 00:29:04.043 ], 00:29:04.043 "read-only": true 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "name": "verbose_mode", 00:29:04.043 "value": true, 00:29:04.043 "unit": "", 00:29:04.043 "desc": "In verbose mode, user is able to get access to additional advanced FTL properties" 00:29:04.043 }, 00:29:04.043 { 00:29:04.043 "name": "prep_upgrade_on_shutdown", 00:29:04.043 "value": false, 00:29:04.043 "unit": "", 00:29:04.043 "desc": "During shutdown, FTL executes all actions which are needed for upgrade to a new version" 00:29:04.043 } 00:29:04.043 ] 00:29:04.043 } 00:29:04.043 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@82 -- # ftl_get_properties 00:29:04.043 05:19:24 
ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@59 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_get_properties -b ftl 00:29:04.043 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@82 -- # jq '[.properties[] | select(.name == "cache_device") | .chunks[] | select(.utilization != 0.0)] | length' 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@82 -- # used=0 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@83 -- # [[ 0 -ne 0 ]] 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@89 -- # ftl_get_properties 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@59 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_get_properties -b ftl 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@89 -- # jq '[.properties[] | select(.name == "bands") | .bands[] | select(.state == "OPENED")] | length' 00:29:04.305 Validate MD5 checksum, iteration 1 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@89 -- # opened=0 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@90 -- # [[ 0 -ne 0 ]] 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@111 -- # test_validate_checksum 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@96 -- # skip=0 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i = 0 )) 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i < iterations )) 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@98 -- # echo 'Validate MD5 checksum, iteration 1' 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@99 -- # tcp_dd --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=0 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/common.sh@198 -- # tcp_initiator_setup 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/common.sh@151 -- # local 'rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock' 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/common.sh@153 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json ]] 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/common.sh@154 -- # return 0 00:29:04.305 05:19:24 ftl.ftl_upgrade_shutdown -- ftl/common.sh@199 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd '--cpumask=[1]' --rpc-socket=/var/tmp/spdk.tgt.sock --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=0 00:29:04.566 [2024-12-15 05:19:24.495017] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
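The tcp_dd call above hands the transfer to spdk_dd, which attaches to ftln1 over NVMe/TCP and copies 1024 x 1 MiB blocks into a scratch file before it is fingerprinted. A condensed sketch of how test_validate_checksum drives these iterations, following the values shown in the trace ($testdir stands in for /home/vagrant/spdk_repo/spdk/test/ftl; the loop body is a reconstruction, not the verbatim script):

  # One pass per 1 GiB window of ftln1 (sketch; tcp_dd wraps spdk_dd as above).
  iterations=2 skip=0
  for ((i = 0; i < iterations; i++)); do
    echo "Validate MD5 checksum, iteration $((i + 1))"
    tcp_dd --ib=ftln1 --of="$testdir/file" --bs=1048576 --count=1024 \
           --qd=2 --skip=$skip
    skip=$((skip + 1024))
    md5sum "$testdir/file" | cut -f1 -d' '   # fingerprint this window
  done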
00:29:04.566 [2024-12-15 05:19:24.495625] [ DPDK EAL parameters: spdk_dd --no-shconf -l 1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96251 ] 00:29:04.566 [2024-12-15 05:19:24.652640] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:04.566 [2024-12-15 05:19:24.677213] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:29:06.036  [2024-12-15T05:19:26.748Z] Copying: 684/1024 [MB] (684 MBps) [2024-12-15T05:19:27.690Z] Copying: 1024/1024 [MB] (average 622 MBps) 00:29:07.550 00:29:07.550 05:19:27 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@100 -- # skip=1024 00:29:07.550 05:19:27 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@102 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/file 00:29:10.094 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@103 -- # cut -f1 '-d ' 00:29:10.094 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@103 -- # sum=7277b8a181cfd328f6ae0145f52da3f0 00:29:10.094 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@105 -- # [[ 7277b8a181cfd328f6ae0145f52da3f0 != \7\2\7\7\b\8\a\1\8\1\c\f\d\3\2\8\f\6\a\e\0\1\4\5\f\5\2\d\a\3\f\0 ]] 00:29:10.095 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i++ )) 00:29:10.095 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i < iterations )) 00:29:10.095 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@98 -- # echo 'Validate MD5 checksum, iteration 2' 00:29:10.095 Validate MD5 checksum, iteration 2 00:29:10.095 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@99 -- # tcp_dd --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=1024 00:29:10.095 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/common.sh@198 -- # tcp_initiator_setup 00:29:10.095 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/common.sh@151 -- # local 'rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock' 00:29:10.095 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/common.sh@153 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json ]] 00:29:10.095 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/common.sh@154 -- # return 0 00:29:10.095 05:19:29 ftl.ftl_upgrade_shutdown -- ftl/common.sh@199 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd '--cpumask=[1]' --rpc-socket=/var/tmp/spdk.tgt.sock --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=1024 00:29:10.095 [2024-12-15 05:19:29.716083] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
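The long backslash-escaped string in the comparison above is just xtrace's rendering of the expected checksum: inside [[ a != b ]] bash treats the right-hand side as a glob pattern, so the script escapes it to force a byte-for-byte comparison. The equivalent, written plainly (the file variable here is illustrative):

  expected=7277b8a181cfd328f6ae0145f52da3f0
  actual=$(md5sum "$file" | cut -f1 -d' ')
  # Quoting the right-hand side disables glob matching in [[ ... ]]
  [[ $actual != "$expected" ]] && { echo "MD5 mismatch" >&2; exit 1; }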
00:29:10.095 [2024-12-15 05:19:29.716216] [ DPDK EAL parameters: spdk_dd --no-shconf -l 1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96313 ] 00:29:10.095 [2024-12-15 05:19:29.875194] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:10.095 [2024-12-15 05:19:29.893343] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:29:11.482  [2024-12-15T05:19:31.883Z] Copying: 703/1024 [MB] (703 MBps) [2024-12-15T05:19:33.797Z] Copying: 1024/1024 [MB] (average 640 MBps) 00:29:13.657 00:29:13.657 05:19:33 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@100 -- # skip=2048 00:29:13.657 05:19:33 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@102 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/file 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@103 -- # cut -f1 '-d ' 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@103 -- # sum=069a163c4aad509a297c87c187f56e27 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@105 -- # [[ 069a163c4aad509a297c87c187f56e27 != \0\6\9\a\1\6\3\c\4\a\a\d\5\0\9\a\2\9\7\c\8\7\c\1\8\7\f\5\6\e\2\7 ]] 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i++ )) 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i < iterations )) 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@114 -- # tcp_target_shutdown_dirty 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/common.sh@137 -- # [[ -n 96182 ]] 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/common.sh@138 -- # kill -9 96182 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/common.sh@139 -- # unset spdk_tgt_pid 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@115 -- # tcp_target_setup 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/common.sh@81 -- # local base_bdev= 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/common.sh@82 -- # local cache_bdev= 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/common.sh@84 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json ]] 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/common.sh@89 -- # spdk_tgt_pid=96376 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/common.sh@85 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt '--cpumask=[0]' --config=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/common.sh@90 -- # export spdk_tgt_pid 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- ftl/common.sh@91 -- # waitforlisten 96376 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@835 -- # '[' -z 96376 ']' 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@840 -- # local max_retries=100 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:15.571 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
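tcp_target_shutdown_dirty above deliberately SIGKILLs the running target so FTL never executes its clean shutdown path, then tcp_target_setup starts a new instance (pid 96376) from the same JSON config; the remainder of the log is that instance recovering the dirty state. Condensed from the trace (variable names follow ftl/common.sh as shown; a sketch, not the verbatim script):

  # Dirty shutdown: FTL gets no chance to persist a clean superblock.
  kill -9 "$spdk_tgt_pid"
  unset spdk_tgt_pid
  # Relaunch from the same config; startup must recover the metadata.
  /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt '--cpumask=[0]' \
      --config=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json &
  spdk_tgt_pid=$!
  waitforlisten "$spdk_tgt_pid"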
00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@844 -- # xtrace_disable 00:29:15.571 05:19:35 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@10 -- # set +x 00:29:15.571 [2024-12-15 05:19:35.475255] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:29:15.571 [2024-12-15 05:19:35.475511] [ DPDK EAL parameters: spdk_tgt --no-shconf -l 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96376 ] 00:29:15.571 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 834: 96182 Killed $spdk_tgt_bin "--cpumask=$spdk_tgt_cpumask" --config="$spdk_tgt_cnfg" 00:29:15.571 [2024-12-15 05:19:35.626262] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:15.571 [2024-12-15 05:19:35.642712] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:29:15.831 [2024-12-15 05:19:35.895118] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: cachen1 00:29:15.831 [2024-12-15 05:19:35.895324] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: cachen1 00:29:16.093 [2024-12-15 05:19:36.032957] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.033072] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Check configuration 00:29:16.094 [2024-12-15 05:19:36.033127] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.003 ms 00:29:16.094 [2024-12-15 05:19:36.033145] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.033199] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.033219] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Open base bdev 00:29:16.094 [2024-12-15 05:19:36.033236] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.028 ms 00:29:16.094 [2024-12-15 05:19:36.033251] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.033280] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl] Using cachen1p0 as write buffer cache 00:29:16.094 [2024-12-15 05:19:36.033538] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl] Using bdev as NV Cache device 00:29:16.094 [2024-12-15 05:19:36.033575] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.033590] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Open cache bdev 00:29:16.094 [2024-12-15 05:19:36.033606] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.301 ms 00:29:16.094 [2024-12-15 05:19:36.033624] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.033872] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl] SHM: clean 0, shm_clean 0 00:29:16.094 [2024-12-15 05:19:36.037054] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.037162] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Load super block 00:29:16.094 [2024-12-15 05:19:36.037217] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 3.183 ms 00:29:16.094 [2024-12-15 05:19:36.037235] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.037990] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] 
Action 00:29:16.094 [2024-12-15 05:19:36.038071] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Validate super block 00:29:16.094 [2024-12-15 05:19:36.038082] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.023 ms 00:29:16.094 [2024-12-15 05:19:36.038089] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.038297] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.038306] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize memory pools 00:29:16.094 [2024-12-15 05:19:36.038314] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.161 ms 00:29:16.094 [2024-12-15 05:19:36.038320] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.038345] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.038352] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize bands 00:29:16.094 [2024-12-15 05:19:36.038358] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.013 ms 00:29:16.094 [2024-12-15 05:19:36.038364] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.038383] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.038391] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Register IO device 00:29:16.094 [2024-12-15 05:19:36.038398] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.005 ms 00:29:16.094 [2024-12-15 05:19:36.038406] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.038421] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl] FTL IO channel created on app_thread 00:29:16.094 [2024-12-15 05:19:36.039122] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.039135] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize core IO channel 00:29:16.094 [2024-12-15 05:19:36.039142] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.704 ms 00:29:16.094 [2024-12-15 05:19:36.039148] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.039168] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.039176] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Decorate bands 00:29:16.094 [2024-12-15 05:19:36.039185] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.003 ms 00:29:16.094 [2024-12-15 05:19:36.039190] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.039206] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl] FTL layout setup mode 0 00:29:16.094 [2024-12-15 05:19:36.039222] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl] nvc layout blob load 0x150 bytes 00:29:16.094 [2024-12-15 05:19:36.039247] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl] base layout blob load 0x48 bytes 00:29:16.094 [2024-12-15 05:19:36.039261] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl] layout blob load 0x190 bytes 00:29:16.094 [2024-12-15 05:19:36.039347] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl] nvc layout blob store 0x150 bytes 00:29:16.094 [2024-12-15 05:19:36.039356] upgrade/ftl_sb_v5.c: 
101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl] base layout blob store 0x48 bytes 00:29:16.094 [2024-12-15 05:19:36.039367] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl] layout blob store 0x190 bytes 00:29:16.094 [2024-12-15 05:19:36.039374] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl] Base device capacity: 20480.00 MiB 00:29:16.094 [2024-12-15 05:19:36.039381] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl] NV cache device capacity: 5120.00 MiB 00:29:16.094 [2024-12-15 05:19:36.039388] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl] L2P entries: 3774873 00:29:16.094 [2024-12-15 05:19:36.039394] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl] L2P address size: 4 00:29:16.094 [2024-12-15 05:19:36.039400] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl] P2L checkpoint pages: 2048 00:29:16.094 [2024-12-15 05:19:36.039405] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl] NV cache chunk count 5 00:29:16.094 [2024-12-15 05:19:36.039412] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.039418] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize layout 00:29:16.094 [2024-12-15 05:19:36.039425] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.207 ms 00:29:16.094 [2024-12-15 05:19:36.039431] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.039508] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.094 [2024-12-15 05:19:36.039515] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Verify layout 00:29:16.094 [2024-12-15 05:19:36.039524] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.053 ms 00:29:16.094 [2024-12-15 05:19:36.039531] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.094 [2024-12-15 05:19:36.039608] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl] NV cache layout: 00:29:16.094 [2024-12-15 05:19:36.039616] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region sb 00:29:16.094 [2024-12-15 05:19:36.039623] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.00 MiB 00:29:16.094 [2024-12-15 05:19:36.039630] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039637] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region l2p 00:29:16.094 [2024-12-15 05:19:36.039643] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039648] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 14.50 MiB 00:29:16.094 [2024-12-15 05:19:36.039653] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region band_md 00:29:16.094 [2024-12-15 05:19:36.039659] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 14.62 MiB 00:29:16.094 [2024-12-15 05:19:36.039665] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039670] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region band_md_mirror 00:29:16.094 [2024-12-15 05:19:36.039678] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 14.75 MiB 00:29:16.094 [2024-12-15 05:19:36.039683] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039688] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region nvc_md 00:29:16.094 [2024-12-15 05:19:36.039693] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.38 MiB 
00:29:16.094 [2024-12-15 05:19:36.039703] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039708] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region nvc_md_mirror 00:29:16.094 [2024-12-15 05:19:36.039714] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.50 MiB 00:29:16.094 [2024-12-15 05:19:36.039719] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039724] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l0 00:29:16.094 [2024-12-15 05:19:36.039729] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 14.88 MiB 00:29:16.094 [2024-12-15 05:19:36.039734] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:29:16.094 [2024-12-15 05:19:36.039739] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l1 00:29:16.094 [2024-12-15 05:19:36.039743] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 22.88 MiB 00:29:16.094 [2024-12-15 05:19:36.039748] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:29:16.094 [2024-12-15 05:19:36.039753] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l2 00:29:16.094 [2024-12-15 05:19:36.039758] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 30.88 MiB 00:29:16.094 [2024-12-15 05:19:36.039762] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:29:16.094 [2024-12-15 05:19:36.039767] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region p2l3 00:29:16.094 [2024-12-15 05:19:36.039773] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 38.88 MiB 00:29:16.094 [2024-12-15 05:19:36.039779] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 8.00 MiB 00:29:16.094 [2024-12-15 05:19:36.039787] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_md 00:29:16.094 [2024-12-15 05:19:36.039794] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 46.88 MiB 00:29:16.094 [2024-12-15 05:19:36.039799] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039805] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_md_mirror 00:29:16.094 [2024-12-15 05:19:36.039810] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.00 MiB 00:29:16.094 [2024-12-15 05:19:36.039817] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039823] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_log 00:29:16.094 [2024-12-15 05:19:36.039829] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039835] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039841] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region trim_log_mirror 00:29:16.094 [2024-12-15 05:19:36.039847] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 47.25 MiB 00:29:16.094 [2024-12-15 05:19:36.039853] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.12 MiB 00:29:16.094 [2024-12-15 05:19:36.039859] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl] Base device layout: 00:29:16.094 [2024-12-15 05:19:36.039866] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region sb_mirror 00:29:16.095 [2024-12-15 05:19:36.039872] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.00 MiB 00:29:16.095 [2024-12-15 05:19:36.039880] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 
0.12 MiB 00:29:16.095 [2024-12-15 05:19:36.039888] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region vmap 00:29:16.095 [2024-12-15 05:19:36.039894] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 18432.25 MiB 00:29:16.095 [2024-12-15 05:19:36.039900] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 0.88 MiB 00:29:16.095 [2024-12-15 05:19:36.039906] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl] Region data_btm 00:29:16.095 [2024-12-15 05:19:36.039911] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl] offset: 0.25 MiB 00:29:16.095 [2024-12-15 05:19:36.039918] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl] blocks: 18432.00 MiB 00:29:16.095 [2024-12-15 05:19:36.039925] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] SB metadata layout - nvc: 00:29:16.095 [2024-12-15 05:19:36.039932] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.039939] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0xe80 00:29:16.095 [2024-12-15 05:19:36.039946] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x3 ver:2 blk_offs:0xea0 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.039952] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x4 ver:2 blk_offs:0xec0 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.039958] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xa ver:2 blk_offs:0xee0 blk_sz:0x800 00:29:16.095 [2024-12-15 05:19:36.039964] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xb ver:2 blk_offs:0x16e0 blk_sz:0x800 00:29:16.095 [2024-12-15 05:19:36.039970] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xc ver:2 blk_offs:0x1ee0 blk_sz:0x800 00:29:16.095 [2024-12-15 05:19:36.039976] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xd ver:2 blk_offs:0x26e0 blk_sz:0x800 00:29:16.095 [2024-12-15 05:19:36.039983] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xe ver:0 blk_offs:0x2ee0 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.039991] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xf ver:0 blk_offs:0x2f00 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.039997] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x10 ver:1 blk_offs:0x2f20 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.040003] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x11 ver:1 blk_offs:0x2f40 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.040009] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x6 ver:2 blk_offs:0x2f60 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.040015] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x7 ver:2 blk_offs:0x2f80 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.040022] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xfffffffe ver:0 blk_offs:0x2fa0 blk_sz:0x13d060 00:29:16.095 [2024-12-15 05:19:36.040029] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] SB metadata 
layout - base dev: 00:29:16.095 [2024-12-15 05:19:36.040036] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.040042] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:29:16.095 [2024-12-15 05:19:36.040049] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x480000 00:29:16.095 [2024-12-15 05:19:36.040056] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0x5 ver:0 blk_offs:0x480040 blk_sz:0xe0 00:29:16.095 [2024-12-15 05:19:36.040062] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl] Region type:0xfffffffe ver:0 blk_offs:0x480120 blk_sz:0x7fee0 00:29:16.095 [2024-12-15 05:19:36.040069] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.040076] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Layout upgrade 00:29:16.095 [2024-12-15 05:19:36.040083] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.516 ms 00:29:16.095 [2024-12-15 05:19:36.040092] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.046191] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.046271] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize metadata 00:29:16.095 [2024-12-15 05:19:36.046344] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 6.056 ms 00:29:16.095 [2024-12-15 05:19:36.046362] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.046400] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.046416] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize band addresses 00:29:16.095 [2024-12-15 05:19:36.046431] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.010 ms 00:29:16.095 [2024-12-15 05:19:36.046488] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.053921] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.054015] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize NV cache 00:29:16.095 [2024-12-15 05:19:36.054056] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 7.391 ms 00:29:16.095 [2024-12-15 05:19:36.054080] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.054113] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.054193] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize valid map 00:29:16.095 [2024-12-15 05:19:36.054215] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.002 ms 00:29:16.095 [2024-12-15 05:19:36.054232] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.054309] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.054338] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize trim map 00:29:16.095 [2024-12-15 05:19:36.054360] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.034 ms 00:29:16.095 [2024-12-15 05:19:36.054495] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: 
[FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.054545] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.054562] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize bands metadata 00:29:16.095 [2024-12-15 05:19:36.054668] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.018 ms 00:29:16.095 [2024-12-15 05:19:36.054688] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.059527] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.059865] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize reloc 00:29:16.095 [2024-12-15 05:19:36.060034] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 4.812 ms 00:29:16.095 [2024-12-15 05:19:36.060052] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.060205] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.060244] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize recovery 00:29:16.095 [2024-12-15 05:19:36.060263] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.001 ms 00:29:16.095 [2024-12-15 05:19:36.060635] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.078240] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.078402] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Recover band state 00:29:16.095 [2024-12-15 05:19:36.078500] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 17.571 ms 00:29:16.095 [2024-12-15 05:19:36.078535] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.079964] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.080085] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize P2L checkpointing 00:29:16.095 [2024-12-15 05:19:36.080163] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.353 ms 00:29:16.095 [2024-12-15 05:19:36.080257] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.095214] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.095319] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Restore P2L checkpoints 00:29:16.095 [2024-12-15 05:19:36.095360] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 14.883 ms 00:29:16.095 [2024-12-15 05:19:36.095381] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.095499] mngt/ftl_mngt_recovery.c: 596:p2l_ckpt_preprocess: *NOTICE*: [FTL][ftl] P2L ckpt_id=0 found seq_id=8 00:29:16.095 [2024-12-15 05:19:36.095682] mngt/ftl_mngt_recovery.c: 596:p2l_ckpt_preprocess: *NOTICE*: [FTL][ftl] P2L ckpt_id=1 found seq_id=9 00:29:16.095 [2024-12-15 05:19:36.095749] mngt/ftl_mngt_recovery.c: 596:p2l_ckpt_preprocess: *NOTICE*: [FTL][ftl] P2L ckpt_id=2 found seq_id=12 00:29:16.095 [2024-12-15 05:19:36.095819] mngt/ftl_mngt_recovery.c: 596:p2l_ckpt_preprocess: *NOTICE*: [FTL][ftl] P2L ckpt_id=3 found seq_id=0 00:29:16.095 [2024-12-15 05:19:36.095826] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.095833] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Preprocess P2L checkpoints 00:29:16.095 [2024-12-15 
05:19:36.095843] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.410 ms 00:29:16.095 [2024-12-15 05:19:36.095850] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.095885] mngt/ftl_mngt_recovery.c: 650:ftl_mngt_recovery_open_bands_p2l: *NOTICE*: [FTL][ftl] No more open bands to recover from P2L 00:29:16.095 [2024-12-15 05:19:36.095894] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.095900] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Recover open bands P2L 00:29:16.095 [2024-12-15 05:19:36.095907] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.009 ms 00:29:16.095 [2024-12-15 05:19:36.095913] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.098108] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.098205] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Recover chunk state 00:29:16.095 [2024-12-15 05:19:36.098217] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 2.176 ms 00:29:16.095 [2024-12-15 05:19:36.098230] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.098727] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.098742] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Recover max seq ID 00:29:16.095 [2024-12-15 05:19:36.098750] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.007 ms 00:29:16.095 [2024-12-15 05:19:36.098756] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.095 [2024-12-15 05:19:36.098808] ftl_nv_cache.c:2274:recover_open_chunk_prepare: *NOTICE*: [FTL][ftl] Start recovery open chunk, offset = 262144, seq id 14 00:29:16.095 [2024-12-15 05:19:36.098927] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.095 [2024-12-15 05:19:36.098936] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Chunk recovery, prepare 00:29:16.095 [2024-12-15 05:19:36.098945] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.120 ms 00:29:16.095 [2024-12-15 05:19:36.098953] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.669 [2024-12-15 05:19:36.716873] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.669 [2024-12-15 05:19:36.716919] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Chunk recovery, read vss 00:29:16.669 [2024-12-15 05:19:36.716931] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 617.697 ms 00:29:16.669 [2024-12-15 05:19:36.716938] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.669 [2024-12-15 05:19:36.718215] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.669 [2024-12-15 05:19:36.718254] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Chunk recovery, persist P2L map 00:29:16.669 [2024-12-15 05:19:36.718265] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.925 ms 00:29:16.669 [2024-12-15 05:19:36.718273] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.669 [2024-12-15 05:19:36.718615] ftl_nv_cache.c:2323:recover_open_chunk_close_chunk_cb: *NOTICE*: [FTL][ftl] Recovered chunk, offset = 262144, seq id 14 00:29:16.669 [2024-12-15 05:19:36.718639] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.669 [2024-12-15 05:19:36.718646] mngt/ftl_mngt.c: 
428:trace_step: *NOTICE*: [FTL][ftl] name: Chunk recovery, close chunk 00:29:16.669 [2024-12-15 05:19:36.718659] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.343 ms 00:29:16.669 [2024-12-15 05:19:36.718665] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.669 [2024-12-15 05:19:36.718689] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.669 [2024-12-15 05:19:36.718699] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Chunk recovery, cleanup 00:29:16.669 [2024-12-15 05:19:36.718706] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.003 ms 00:29:16.669 [2024-12-15 05:19:36.718713] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:16.669 [2024-12-15 05:19:36.718739] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'Recover open chunk', duration = 619.927 ms, result 0 00:29:16.669 [2024-12-15 05:19:36.718767] ftl_nv_cache.c:2274:recover_open_chunk_prepare: *NOTICE*: [FTL][ftl] Start recovery open chunk, offset = 524288, seq id 15 00:29:16.669 [2024-12-15 05:19:36.718846] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:16.669 [2024-12-15 05:19:36.718855] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Chunk recovery, prepare 00:29:16.669 [2024-12-15 05:19:36.718862] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.080 ms 00:29:16.669 [2024-12-15 05:19:36.718867] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.242 [2024-12-15 05:19:37.285888] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.242 [2024-12-15 05:19:37.285959] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Chunk recovery, read vss 00:29:17.242 [2024-12-15 05:19:37.285974] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 566.703 ms 00:29:17.242 [2024-12-15 05:19:37.285983] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.242 [2024-12-15 05:19:37.287937] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.242 [2024-12-15 05:19:37.288139] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Chunk recovery, persist P2L map 00:29:17.242 [2024-12-15 05:19:37.288156] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.596 ms 00:29:17.242 [2024-12-15 05:19:37.288163] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.503 [2024-12-15 05:19:37.414878] ftl_nv_cache.c:2323:recover_open_chunk_close_chunk_cb: *NOTICE*: [FTL][ftl] Recovered chunk, offset = 524288, seq id 15 00:29:17.503 [2024-12-15 05:19:37.415036] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.503 [2024-12-15 05:19:37.415049] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Chunk recovery, close chunk 00:29:17.503 [2024-12-15 05:19:37.415059] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 126.838 ms 00:29:17.503 [2024-12-15 05:19:37.415067] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.503 [2024-12-15 05:19:37.415167] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.503 [2024-12-15 05:19:37.415187] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Chunk recovery, cleanup 00:29:17.503 [2024-12-15 05:19:37.415197] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.006 ms 00:29:17.503 [2024-12-15 05:19:37.415204] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.503 [2024-12-15 
05:19:37.415247] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'Recover open chunk', duration = 696.470 ms, result 0 00:29:17.503 [2024-12-15 05:19:37.415292] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl] FTL NV Cache: full chunks = 2, empty chunks = 2 00:29:17.503 [2024-12-15 05:19:37.415303] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl] FTL NV Cache: state loaded successfully 00:29:17.503 [2024-12-15 05:19:37.415313] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.503 [2024-12-15 05:19:37.415321] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Recover open chunks P2L 00:29:17.503 [2024-12-15 05:19:37.415329] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1316.518 ms 00:29:17.503 [2024-12-15 05:19:37.415340] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.503 [2024-12-15 05:19:37.415374] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.504 [2024-12-15 05:19:37.415386] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Deinitialize recovery 00:29:17.504 [2024-12-15 05:19:37.415395] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.001 ms 00:29:17.504 [2024-12-15 05:19:37.415407] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.504 [2024-12-15 05:19:37.423672] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 1 (of 2) MiB 00:29:17.504 [2024-12-15 05:19:37.423782] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.504 [2024-12-15 05:19:37.423795] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize L2P 00:29:17.504 [2024-12-15 05:19:37.423804] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 8.358 ms 00:29:17.504 [2024-12-15 05:19:37.423812] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.504 [2024-12-15 05:19:37.424505] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.504 [2024-12-15 05:19:37.424525] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Restore L2P from shared memory 00:29:17.504 [2024-12-15 05:19:37.424534] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.628 ms 00:29:17.504 [2024-12-15 05:19:37.424542] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.504 [2024-12-15 05:19:37.426763] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.504 [2024-12-15 05:19:37.426789] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Restore valid maps counters 00:29:17.504 [2024-12-15 05:19:37.426799] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 2.204 ms 00:29:17.504 [2024-12-15 05:19:37.426807] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.504 [2024-12-15 05:19:37.426864] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.504 [2024-12-15 05:19:37.426873] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Complete trim transaction 00:29:17.504 [2024-12-15 05:19:37.426882] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.002 ms 00:29:17.504 [2024-12-15 05:19:37.426889] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.504 [2024-12-15 05:19:37.426989] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.504 [2024-12-15 05:19:37.426999] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Finalize band initialization 00:29:17.504 
[2024-12-15 05:19:37.427010] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.016 ms 00:29:17.504 [2024-12-15 05:19:37.427018] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.504 [2024-12-15 05:19:37.427037] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.504 [2024-12-15 05:19:37.427045] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Start core poller 00:29:17.504 [2024-12-15 05:19:37.427057] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.004 ms 00:29:17.504 [2024-12-15 05:19:37.427067] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.504 [2024-12-15 05:19:37.427102] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl] Self test skipped 00:29:17.504 [2024-12-15 05:19:37.427112] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.504 [2024-12-15 05:19:37.427123] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Self test on startup 00:29:17.504 [2024-12-15 05:19:37.427131] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.011 ms 00:29:17.504 [2024-12-15 05:19:37.427141] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.504 [2024-12-15 05:19:37.427190] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:17.504 [2024-12-15 05:19:37.427198] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Finalize initialization 00:29:17.504 [2024-12-15 05:19:37.427206] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.032 ms 00:29:17.504 [2024-12-15 05:19:37.427214] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:17.504 [2024-12-15 05:19:37.428102] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'FTL startup', duration = 1394.747 ms, result 0 00:29:17.504 [2024-12-15 05:19:37.440510] tcp.c: 756:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:29:17.504 [2024-12-15 05:19:37.456510] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl] FTL IO channel created on nvmf_tgt_poll_group_000 00:29:17.504 [2024-12-15 05:19:37.464614] tcp.c:1099:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 127.0.0.1 port 4420 *** 00:29:18.074 05:19:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@868 -- # return 0 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@93 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json ]] 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@95 -- # return 0 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@116 -- # test_validate_checksum 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@96 -- # skip=0 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i = 0 )) 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i < iterations )) 00:29:18.075 Validate MD5 checksum, iteration 1 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@98 -- # echo 'Validate MD5 checksum, iteration 1' 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@99 -- # tcp_dd --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=0 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@198 -- # tcp_initiator_setup 00:29:18.075 05:19:38 
ftl.ftl_upgrade_shutdown -- ftl/common.sh@151 -- # local 'rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock' 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@153 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json ]] 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@154 -- # return 0 00:29:18.075 05:19:38 ftl.ftl_upgrade_shutdown -- ftl/common.sh@199 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd '--cpumask=[1]' --rpc-socket=/var/tmp/spdk.tgt.sock --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=0 00:29:18.075 [2024-12-15 05:19:38.101967] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:29:18.075 [2024-12-15 05:19:38.102079] [ DPDK EAL parameters: spdk_dd --no-shconf -l 1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96416 ] 00:29:18.336 [2024-12-15 05:19:38.250696] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:18.336 [2024-12-15 05:19:38.271607] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:29:19.720  [2024-12-15T05:19:40.120Z] Copying: 687/1024 [MB] (687 MBps) [2024-12-15T05:19:40.692Z] Copying: 1024/1024 [MB] (average 681 MBps) 00:29:20.552 00:29:20.552 05:19:40 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@100 -- # skip=1024 00:29:20.552 05:19:40 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@102 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/file 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@103 -- # cut -f1 '-d ' 00:29:23.096 Validate MD5 checksum, iteration 2 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@103 -- # sum=7277b8a181cfd328f6ae0145f52da3f0 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@105 -- # [[ 7277b8a181cfd328f6ae0145f52da3f0 != \7\2\7\7\b\8\a\1\8\1\c\f\d\3\2\8\f\6\a\e\0\1\4\5\f\5\2\d\a\3\f\0 ]] 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i++ )) 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i < iterations )) 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@98 -- # echo 'Validate MD5 checksum, iteration 2' 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@99 -- # tcp_dd --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=1024 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/common.sh@198 -- # tcp_initiator_setup 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/common.sh@151 -- # local 'rpc=/home/vagrant/spdk_repo/spdk/scripts/rpc.py -s /var/tmp/spdk.tgt.sock' 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/common.sh@153 -- # [[ -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json ]] 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/common.sh@154 -- # return 0 00:29:23.096 05:19:42 ftl.ftl_upgrade_shutdown -- ftl/common.sh@199 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd '--cpumask=[1]' --rpc-socket=/var/tmp/spdk.tgt.sock --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json --ib=ftln1 --of=/home/vagrant/spdk_repo/spdk/test/ftl/file --bs=1048576 --count=1024 --qd=2 --skip=1024 00:29:23.096 [2024-12-15 05:19:42.815081] Starting SPDK v25.01-pre git sha1 
e01cb43b8 / DPDK 23.11.0 initialization... 00:29:23.096 [2024-12-15 05:19:42.815203] [ DPDK EAL parameters: spdk_dd --no-shconf -l 1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96468 ] 00:29:23.096 [2024-12-15 05:19:42.974585] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:23.096 [2024-12-15 05:19:42.992148] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 1 00:29:24.483  [2024-12-15T05:19:45.194Z] Copying: 559/1024 [MB] (559 MBps) [2024-12-15T05:19:45.765Z] Copying: 1024/1024 [MB] (average 605 MBps) 00:29:25.625 00:29:25.625 05:19:45 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@100 -- # skip=2048 00:29:25.625 05:19:45 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@102 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/file 00:29:28.172 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@103 -- # cut -f1 '-d ' 00:29:28.172 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@103 -- # sum=069a163c4aad509a297c87c187f56e27 00:29:28.172 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@105 -- # [[ 069a163c4aad509a297c87c187f56e27 != \0\6\9\a\1\6\3\c\4\a\a\d\5\0\9\a\2\9\7\c\8\7\c\1\8\7\f\5\6\e\2\7 ]] 00:29:28.172 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i++ )) 00:29:28.172 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@97 -- # (( i < iterations )) 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@118 -- # trap - SIGINT SIGTERM EXIT 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@119 -- # cleanup 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@11 -- # trap - SIGINT SIGTERM EXIT 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@12 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/file 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@13 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/file.md5 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@14 -- # tcp_cleanup 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/common.sh@193 -- # tcp_target_cleanup 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/common.sh@144 -- # tcp_target_shutdown 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/common.sh@130 -- # [[ -n 96376 ]] 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- ftl/common.sh@131 -- # killprocess 96376 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@954 -- # '[' -z 96376 ']' 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@958 -- # kill -0 96376 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@959 -- # uname 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 96376 00:29:28.173 killing process with pid 96376 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@972 -- # echo 'killing process with pid 96376' 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- 
common/autotest_common.sh@973 -- # kill 96376 00:29:28.173 05:19:47 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@978 -- # wait 96376 00:29:28.173 [2024-12-15 05:19:48.016591] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl] FTL IO channel destroy on nvmf_tgt_poll_group_000 00:29:28.173 [2024-12-15 05:19:48.019749] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.019782] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Deinit core IO channel 00:29:28.173 [2024-12-15 05:19:48.019792] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.003 ms 00:29:28.173 [2024-12-15 05:19:48.019799] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.019816] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl] FTL IO channel destroy on app_thread 00:29:28.173 [2024-12-15 05:19:48.020194] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.020234] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Unregister IO device 00:29:28.173 [2024-12-15 05:19:48.020243] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.368 ms 00:29:28.173 [2024-12-15 05:19:48.020249] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.020551] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.020580] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Stop core poller 00:29:28.173 [2024-12-15 05:19:48.020602] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.284 ms 00:29:28.173 [2024-12-15 05:19:48.020617] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.021758] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.021841] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist L2P 00:29:28.173 [2024-12-15 05:19:48.021886] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.120 ms 00:29:28.173 [2024-12-15 05:19:48.021911] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.022811] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.022884] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Finish L2P trims 00:29:28.173 [2024-12-15 05:19:48.022924] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.867 ms 00:29:28.173 [2024-12-15 05:19:48.022941] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.024500] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.024590] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist NV cache metadata 00:29:28.173 [2024-12-15 05:19:48.024636] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.521 ms 00:29:28.173 [2024-12-15 05:19:48.024653] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.026047] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.026134] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist valid map metadata 00:29:28.173 [2024-12-15 05:19:48.026171] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.355 ms 00:29:28.173 [2024-12-15 05:19:48.026188] mngt/ftl_mngt.c: 431:trace_step: 
*NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.026255] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.026273] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist P2L metadata 00:29:28.173 [2024-12-15 05:19:48.026289] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.033 ms 00:29:28.173 [2024-12-15 05:19:48.026308] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.027975] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.028060] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist band info metadata 00:29:28.173 [2024-12-15 05:19:48.028097] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.645 ms 00:29:28.173 [2024-12-15 05:19:48.028113] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.030553] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.030636] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist trim metadata 00:29:28.173 [2024-12-15 05:19:48.030647] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 2.409 ms 00:29:28.173 [2024-12-15 05:19:48.030652] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.032356] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.032470] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Persist superblock 00:29:28.173 [2024-12-15 05:19:48.032515] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.679 ms 00:29:28.173 [2024-12-15 05:19:48.032532] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.034469] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.034558] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Set FTL clean state 00:29:28.173 [2024-12-15 05:19:48.034617] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.820 ms 00:29:28.173 [2024-12-15 05:19:48.034633] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.034693] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Bands validity: 00:29:28.173 [2024-12-15 05:19:48.034718] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 1: 261120 / 261120 wr_cnt: 1 state: closed 00:29:28.173 [2024-12-15 05:19:48.034742] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 2: 261120 / 261120 wr_cnt: 1 state: closed 00:29:28.173 [2024-12-15 05:19:48.034797] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 3: 2048 / 261120 wr_cnt: 1 state: closed 00:29:28.173 [2024-12-15 05:19:48.034821] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.034844] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.034889] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.034962] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.034984] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 
[2024-12-15 05:19:48.035007] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.035029] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.035051] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.035067] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.035074] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.035080] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.035086] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.035091] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.035098] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.035104] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:29:28.173 [2024-12-15 05:19:48.035111] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] 00:29:28.173 [2024-12-15 05:19:48.035117] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] device UUID: b0da1c54-e331-4f00-a60a-a11723dff425 00:29:28.173 [2024-12-15 05:19:48.035124] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] total valid LBAs: 524288 00:29:28.173 [2024-12-15 05:19:48.035129] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] total writes: 320 00:29:28.173 [2024-12-15 05:19:48.035134] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] user writes: 0 00:29:28.173 [2024-12-15 05:19:48.035140] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] WAF: inf 00:29:28.173 [2024-12-15 05:19:48.035146] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] limits: 00:29:28.173 [2024-12-15 05:19:48.035151] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] crit: 0 00:29:28.173 [2024-12-15 05:19:48.035162] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] high: 0 00:29:28.173 [2024-12-15 05:19:48.035167] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] low: 0 00:29:28.173 [2024-12-15 05:19:48.035172] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl] start: 0 00:29:28.173 [2024-12-15 05:19:48.035178] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.035185] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Dump statistics 00:29:28.173 [2024-12-15 05:19:48.035192] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.486 ms 00:29:28.173 [2024-12-15 05:19:48.035197] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.173 [2024-12-15 05:19:48.036563] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.173 [2024-12-15 05:19:48.036646] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Deinitialize L2P 00:29:28.173 [2024-12-15 05:19:48.036684] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 1.351 ms 00:29:28.173 [2024-12-15 05:19:48.036700] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 
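Every management step in the stream above is emitted by mngt/ftl_mngt.c:trace_step as an Action / name / duration / status quadruple. When such a capture is split one record per line, a short awk pass can rank the steps by cost; this is only a sketch against the format shown here, and ftl.log is a hypothetical capture file:

# Rank FTL management steps by duration (sketch; assumes one trace_step
# record per line of ftl.log, in the exact format logged above).
awk '/trace_step.*name:/     { sub(/.*name: /, "");     name = $0 }
     /trace_step.*duration:/ { sub(/.*duration: /, ""); printf "%10s ms  %s\n", $1, name }' \
    ftl.log | sort -rn | head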
00:29:28.173 [2024-12-15 05:19:48.036785] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Action 00:29:28.174 [2024-12-15 05:19:48.036802] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Deinitialize P2L checkpointing 00:29:28.174 [2024-12-15 05:19:48.036817] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.054 ms 00:29:28.174 [2024-12-15 05:19:48.036831] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.041423] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.041519] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize reloc 00:29:28.174 [2024-12-15 05:19:48.041558] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.041579] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.041611] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.041626] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize bands metadata 00:29:28.174 [2024-12-15 05:19:48.041641] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.041655] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.041705] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.041724] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize trim map 00:29:28.174 [2024-12-15 05:19:48.041745] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.041793] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.041823] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.041830] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize valid map 00:29:28.174 [2024-12-15 05:19:48.041837] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.041843] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.050062] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.050095] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize NV cache 00:29:28.174 [2024-12-15 05:19:48.050103] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.050109] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.056235] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.056263] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize metadata 00:29:28.174 [2024-12-15 05:19:48.056271] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.056277] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.056327] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.056334] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize core IO channel 00:29:28.174 [2024-12-15 05:19:48.056340] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.056346] 
mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.056376] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.056385] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize bands 00:29:28.174 [2024-12-15 05:19:48.056392] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.056398] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.056464] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.056474] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize memory pools 00:29:28.174 [2024-12-15 05:19:48.056480] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.056485] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.056509] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.056516] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Initialize superblock 00:29:28.174 [2024-12-15 05:19:48.056524] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.056530] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.056558] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.056570] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Open cache bdev 00:29:28.174 [2024-12-15 05:19:48.056577] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.056583] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.056620] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl] Rollback 00:29:28.174 [2024-12-15 05:19:48.056629] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl] name: Open base bdev 00:29:28.174 [2024-12-15 05:19:48.056636] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl] duration: 0.000 ms 00:29:28.174 [2024-12-15 05:19:48.056642] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl] status: 0 00:29:28.174 [2024-12-15 05:19:48.056742] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl] Management process finished, name 'FTL shutdown', duration = 36.972 ms, result 0 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@132 -- # unset spdk_tgt_pid 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@145 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@194 -- # tcp_initiator_cleanup 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@188 -- # tcp_initiator_shutdown 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@181 -- # [[ -n '' ]] 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@189 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:29:28.174 Remove shared memory files 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/upgrade_shutdown.sh@15 -- # remove_shm 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@204 -- # echo Remove shared memory files 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@205 -- # rm -f rm -f 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@206 -- # rm -f rm -f 00:29:28.174 05:19:48 
ftl.ftl_upgrade_shutdown -- ftl/common.sh@207 -- # rm -f rm -f /dev/shm/spdk_tgt_trace.pid96182 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@208 -- # rm -f rm -f /dev/shm/iscsi 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- ftl/common.sh@209 -- # rm -f rm -f 00:29:28.174 ************************************ 00:29:28.174 END TEST ftl_upgrade_shutdown 00:29:28.174 ************************************ 00:29:28.174 00:29:28.174 real 1m12.090s 00:29:28.174 user 1m37.790s 00:29:28.174 sys 0m18.428s 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@1130 -- # xtrace_disable 00:29:28.174 05:19:48 ftl.ftl_upgrade_shutdown -- common/autotest_common.sh@10 -- # set +x 00:29:28.174 05:19:48 ftl -- ftl/ftl.sh@80 -- # [[ 1 -eq 1 ]] 00:29:28.174 05:19:48 ftl -- ftl/ftl.sh@81 -- # run_test ftl_restore_fast /home/vagrant/spdk_repo/spdk/test/ftl/restore.sh -f -c 0000:00:10.0 0000:00:11.0 00:29:28.174 05:19:48 ftl -- common/autotest_common.sh@1105 -- # '[' 6 -le 1 ']' 00:29:28.174 05:19:48 ftl -- common/autotest_common.sh@1111 -- # xtrace_disable 00:29:28.174 05:19:48 ftl -- common/autotest_common.sh@10 -- # set +x 00:29:28.174 ************************************ 00:29:28.174 START TEST ftl_restore_fast 00:29:28.174 ************************************ 00:29:28.174 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@1129 -- # /home/vagrant/spdk_repo/spdk/test/ftl/restore.sh -f -c 0000:00:10.0 0000:00:11.0 00:29:28.435 * Looking for test storage... 00:29:28.435 * Found test storage at /home/vagrant/spdk_repo/spdk/test/ftl 00:29:28.435 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@1710 -- # [[ y == y ]] 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@1711 -- # lcov --version 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@1711 -- # awk '{print $NF}' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@1711 -- # lt 1.15 2 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@373 -- # cmp_versions 1.15 '<' 2 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@333 -- # local ver1 ver1_l 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@334 -- # local ver2 ver2_l 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@336 -- # IFS=.-: 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@336 -- # read -ra ver1 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@337 -- # IFS=.-: 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@337 -- # read -ra ver2 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@338 -- # local 'op=<' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@340 -- # ver1_l=2 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@341 -- # ver2_l=1 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@343 -- # local lt=0 gt=0 eq=0 v 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@344 -- # case "$op" in 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@345 -- # : 1 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@364 -- # (( v = 0 )) 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@364 -- # (( v < (ver1_l > ver2_l ? 
ver1_l : ver2_l) )) 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@365 -- # decimal 1 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@353 -- # local d=1 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@354 -- # [[ 1 =~ ^[0-9]+$ ]] 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@355 -- # echo 1 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@365 -- # ver1[v]=1 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@366 -- # decimal 2 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@353 -- # local d=2 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@354 -- # [[ 2 =~ ^[0-9]+$ ]] 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@355 -- # echo 2 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@366 -- # ver2[v]=2 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@367 -- # (( ver1[v] > ver2[v] )) 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@368 -- # (( ver1[v] < ver2[v] )) 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- scripts/common.sh@368 -- # return 0 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@1712 -- # lcov_rc_opt='--rc lcov_branch_coverage=1 --rc lcov_function_coverage=1' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@1724 -- # export 'LCOV_OPTS= 00:29:28.436 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:29:28.436 --rc genhtml_branch_coverage=1 00:29:28.436 --rc genhtml_function_coverage=1 00:29:28.436 --rc genhtml_legend=1 00:29:28.436 --rc geninfo_all_blocks=1 00:29:28.436 --rc geninfo_unexecuted_blocks=1 00:29:28.436 00:29:28.436 ' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@1724 -- # LCOV_OPTS=' 00:29:28.436 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:29:28.436 --rc genhtml_branch_coverage=1 00:29:28.436 --rc genhtml_function_coverage=1 00:29:28.436 --rc genhtml_legend=1 00:29:28.436 --rc geninfo_all_blocks=1 00:29:28.436 --rc geninfo_unexecuted_blocks=1 00:29:28.436 00:29:28.436 ' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@1725 -- # export 'LCOV=lcov 00:29:28.436 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:29:28.436 --rc genhtml_branch_coverage=1 00:29:28.436 --rc genhtml_function_coverage=1 00:29:28.436 --rc genhtml_legend=1 00:29:28.436 --rc geninfo_all_blocks=1 00:29:28.436 --rc geninfo_unexecuted_blocks=1 00:29:28.436 00:29:28.436 ' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@1725 -- # LCOV='lcov 00:29:28.436 --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 00:29:28.436 --rc genhtml_branch_coverage=1 00:29:28.436 --rc genhtml_function_coverage=1 00:29:28.436 --rc genhtml_legend=1 00:29:28.436 --rc geninfo_all_blocks=1 00:29:28.436 --rc geninfo_unexecuted_blocks=1 00:29:28.436 00:29:28.436 ' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@9 -- # source /home/vagrant/spdk_repo/spdk/test/ftl/common.sh 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@8 -- # dirname /home/vagrant/spdk_repo/spdk/test/ftl/restore.sh 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@8 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@8 -- # testdir=/home/vagrant/spdk_repo/spdk/test/ftl 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@9 -- # readlink -f /home/vagrant/spdk_repo/spdk/test/ftl/../.. 
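The lt 1.15 2 trace above is scripts/common.sh comparing the installed lcov version: both strings are split on ., - or : into arrays and compared element-wise until one component differs. A simplified, self-contained sketch of that logic (the real helper also validates every component via decimal and handles the other operators):

# Simplified sketch of the cmp_versions walk traced above; missing
# components default to 0 and only numeric components are handled.
cmp_versions() {
    local -a ver1 ver2
    local op=$2 v d1 d2
    IFS='.-:' read -ra ver1 <<< "$1"
    IFS='.-:' read -ra ver2 <<< "$3"
    local ver1_l=${#ver1[@]} ver2_l=${#ver2[@]}
    for (( v = 0; v < (ver1_l > ver2_l ? ver1_l : ver2_l); v++ )); do
        d1=${ver1[v]:-0} d2=${ver2[v]:-0}
        # First differing component decides; return status comes from the test.
        (( d1 > d2 )) && { [[ $op == '>' ]]; return; }
        (( d1 < d2 )) && { [[ $op == '<' ]]; return; }
    done
    [[ $op == '==' ]]
}

cmp_versions 1.15 '<' 2 && echo 'lcov 1.15 predates 2'   # prints the message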
00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@9 -- # rootdir=/home/vagrant/spdk_repo/spdk 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@10 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@12 -- # export 'ftl_tgt_core_mask=[0]' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@12 -- # ftl_tgt_core_mask='[0]' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@14 -- # export spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@14 -- # spdk_tgt_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@15 -- # export 'spdk_tgt_cpumask=[0]' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@15 -- # spdk_tgt_cpumask='[0]' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@16 -- # export spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@16 -- # spdk_tgt_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/tgt.json 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@17 -- # export spdk_tgt_pid= 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@17 -- # spdk_tgt_pid= 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@19 -- # export spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@19 -- # spdk_ini_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@20 -- # export 'spdk_ini_cpumask=[1]' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@20 -- # spdk_ini_cpumask='[1]' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@21 -- # export spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@21 -- # spdk_ini_rpc=/var/tmp/spdk.tgt.sock 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@22 -- # export spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@22 -- # spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@23 -- # export spdk_ini_pid= 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@23 -- # spdk_ini_pid= 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@25 -- # export spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/common.sh@25 -- # spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@11 -- # rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@13 -- # mktemp -d 00:29:28.436 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
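The spdk_ini_* and spdk_dd_bin values exported above are the same plumbing the two checksum passes earlier in this log ran through: tcp_dd points spdk_dd at the initiator config and the ftln1 bdev, and the test then md5sums the slice it read back. A condensed sketch of that validate loop, with flags and paths copied from the invocations traced earlier; the expected array of reference checksums is hypothetical (the real test records them before shutdown):

# Condensed sketch of the 'Validate MD5 checksum' loop from
# test/ftl/upgrade_shutdown.sh as it appears in this log.
spdk_dd_bin=/home/vagrant/spdk_repo/spdk/build/bin/spdk_dd
spdk_ini_cnfg=/home/vagrant/spdk_repo/spdk/test/ftl/config/ini.json
file=/home/vagrant/spdk_repo/spdk/test/ftl/file

skip=0
for (( i = 0; i < 2; i++ )); do
    echo "Validate MD5 checksum, iteration $((i + 1))"
    "$spdk_dd_bin" '--cpumask=[1]' --rpc-socket=/var/tmp/spdk.tgt.sock \
        --json="$spdk_ini_cnfg" --ib=ftln1 --of="$file" \
        --bs=1048576 --count=1024 --qd=2 --skip="$skip"
    skip=$((skip + 1024))
    sum=$(md5sum "$file" | cut -f1 -d' ')
    # expected[] is hypothetical here; the real run fails on any mismatch.
    [[ $sum == "${expected[i]}" ]] || exit 1
done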
00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@13 -- # mount_dir=/tmp/tmp.dSAytW3Ku8 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@15 -- # getopts :u:c:f opt 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@16 -- # case $opt in 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@19 -- # fast_shutdown=1 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@15 -- # getopts :u:c:f opt 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@16 -- # case $opt in 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@18 -- # nv_cache=0000:00:10.0 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@15 -- # getopts :u:c:f opt 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@23 -- # shift 3 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@24 -- # device=0000:00:11.0 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@25 -- # timeout=240 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@36 -- # trap 'restore_kill; exit 1' SIGINT SIGTERM EXIT 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@39 -- # svcpid=96602 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@41 -- # waitforlisten 96602 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- ftl/restore.sh@38 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@835 -- # '[' -z 96602 ']' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@840 -- # local max_retries=100 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@844 -- # xtrace_disable 00:29:28.436 05:19:48 ftl.ftl_restore_fast -- common/autotest_common.sh@10 -- # set +x 00:29:28.436 [2024-12-15 05:19:48.519363] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
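waitforlisten, entered above with rpc_addr=/var/tmp/spdk.sock and max_retries=100, essentially polls the target's RPC socket until it answers while checking that the launched pid is still alive. A bare-bones sketch of that wait using rpc.py's rpc_get_methods as the probe (the real helper in autotest_common.sh is more defensive):

# Bare-bones waitforlisten sketch: poll the RPC socket until the spdk_tgt
# started above (pid 96602) responds, giving up after max_retries polls.
rpc_py=/home/vagrant/spdk_repo/spdk/scripts/rpc.py
rpc_addr=/var/tmp/spdk.sock
svcpid=$1                      # pid of the backgrounded spdk_tgt
max_retries=100

for (( i = 0; i < max_retries; i++ )); do
    # Bail out early if the target died during startup.
    kill -0 "$svcpid" 2> /dev/null || { echo 'spdk_tgt exited'; exit 1; }
    if "$rpc_py" -s "$rpc_addr" -t 1 rpc_get_methods &> /dev/null; then
        echo "spdk_tgt is listening on $rpc_addr"
        exit 0
    fi
    sleep 0.5
done
echo "timed out waiting for $rpc_addr"
exit 1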
00:29:28.436 [2024-12-15 05:19:48.519671] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96602 ] 00:29:28.697 [2024-12-15 05:19:48.674429] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:28.697 [2024-12-15 05:19:48.692226] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:29:29.269 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@864 -- # (( i == 0 )) 00:29:29.269 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@868 -- # return 0 00:29:29.269 05:19:49 ftl.ftl_restore_fast -- ftl/restore.sh@43 -- # create_base_bdev nvme0 0000:00:11.0 103424 00:29:29.269 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@54 -- # local name=nvme0 00:29:29.269 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@55 -- # local base_bdf=0000:00:11.0 00:29:29.269 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@56 -- # local size=103424 00:29:29.269 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@59 -- # local base_bdev 00:29:29.269 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@60 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t PCIe -a 0000:00:11.0 00:29:29.528 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@60 -- # base_bdev=nvme0n1 00:29:29.528 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@62 -- # local base_size 00:29:29.528 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@63 -- # get_bdev_size nvme0n1 00:29:29.528 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1382 -- # local bdev_name=nvme0n1 00:29:29.528 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1383 -- # local bdev_info 00:29:29.528 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1384 -- # local bs 00:29:29.528 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1385 -- # local nb 00:29:29.528 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b nvme0n1 00:29:29.826 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:29:29.826 { 00:29:29.826 "name": "nvme0n1", 00:29:29.826 "aliases": [ 00:29:29.826 "d4597ea9-6a7b-44f8-9683-2f618a0c09bc" 00:29:29.826 ], 00:29:29.826 "product_name": "NVMe disk", 00:29:29.826 "block_size": 4096, 00:29:29.826 "num_blocks": 1310720, 00:29:29.826 "uuid": "d4597ea9-6a7b-44f8-9683-2f618a0c09bc", 00:29:29.826 "numa_id": -1, 00:29:29.826 "assigned_rate_limits": { 00:29:29.826 "rw_ios_per_sec": 0, 00:29:29.826 "rw_mbytes_per_sec": 0, 00:29:29.826 "r_mbytes_per_sec": 0, 00:29:29.826 "w_mbytes_per_sec": 0 00:29:29.826 }, 00:29:29.826 "claimed": true, 00:29:29.826 "claim_type": "read_many_write_one", 00:29:29.826 "zoned": false, 00:29:29.826 "supported_io_types": { 00:29:29.826 "read": true, 00:29:29.826 "write": true, 00:29:29.826 "unmap": true, 00:29:29.826 "flush": true, 00:29:29.826 "reset": true, 00:29:29.826 "nvme_admin": true, 00:29:29.826 "nvme_io": true, 00:29:29.826 "nvme_io_md": false, 00:29:29.826 "write_zeroes": true, 00:29:29.826 "zcopy": false, 00:29:29.826 "get_zone_info": false, 00:29:29.826 "zone_management": false, 00:29:29.826 "zone_append": false, 00:29:29.826 "compare": true, 00:29:29.826 "compare_and_write": false, 00:29:29.826 "abort": true, 00:29:29.826 "seek_hole": false, 00:29:29.826 "seek_data": false, 00:29:29.826 "copy": true, 00:29:29.826 "nvme_iov_md": 
false 00:29:29.826 }, 00:29:29.826 "driver_specific": { 00:29:29.826 "nvme": [ 00:29:29.826 { 00:29:29.826 "pci_address": "0000:00:11.0", 00:29:29.826 "trid": { 00:29:29.826 "trtype": "PCIe", 00:29:29.826 "traddr": "0000:00:11.0" 00:29:29.826 }, 00:29:29.826 "ctrlr_data": { 00:29:29.826 "cntlid": 0, 00:29:29.826 "vendor_id": "0x1b36", 00:29:29.826 "model_number": "QEMU NVMe Ctrl", 00:29:29.826 "serial_number": "12341", 00:29:29.826 "firmware_revision": "8.0.0", 00:29:29.826 "subnqn": "nqn.2019-08.org.qemu:12341", 00:29:29.826 "oacs": { 00:29:29.826 "security": 0, 00:29:29.826 "format": 1, 00:29:29.826 "firmware": 0, 00:29:29.826 "ns_manage": 1 00:29:29.826 }, 00:29:29.826 "multi_ctrlr": false, 00:29:29.826 "ana_reporting": false 00:29:29.826 }, 00:29:29.826 "vs": { 00:29:29.826 "nvme_version": "1.4" 00:29:29.826 }, 00:29:29.826 "ns_data": { 00:29:29.826 "id": 1, 00:29:29.826 "can_share": false 00:29:29.826 } 00:29:29.826 } 00:29:29.826 ], 00:29:29.826 "mp_policy": "active_passive" 00:29:29.826 } 00:29:29.826 } 00:29:29.826 ]' 00:29:29.826 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:29:29.826 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1387 -- # bs=4096 00:29:29.826 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:29:29.827 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1388 -- # nb=1310720 00:29:29.827 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1391 -- # bdev_size=5120 00:29:29.827 05:19:49 ftl.ftl_restore_fast -- common/autotest_common.sh@1392 -- # echo 5120 00:29:29.827 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@63 -- # base_size=5120 00:29:29.827 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@64 -- # [[ 103424 -le 5120 ]] 00:29:29.827 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@67 -- # clear_lvols 00:29:29.827 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@28 -- # jq -r '.[] | .uuid' 00:29:29.827 05:19:49 ftl.ftl_restore_fast -- ftl/common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_get_lvstores 00:29:30.110 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@28 -- # stores=4dc5f2db-978b-4031-bbe3-e1880a01b422 00:29:30.110 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@29 -- # for lvs in $stores 00:29:30.110 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@30 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -u 4dc5f2db-978b-4031-bbe3-e1880a01b422 00:29:30.370 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@68 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create_lvstore nvme0n1 lvs 00:29:30.370 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@68 -- # lvs=05487b5d-3ef5-4c8e-9336-20d71573bb2d 00:29:30.370 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@69 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_create nvme0n1p0 103424 -t -u 05487b5d-3ef5-4c8e-9336-20d71573bb2d 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- ftl/restore.sh@43 -- # split_bdev=af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- ftl/restore.sh@44 -- # '[' -n 0000:00:10.0 ']' 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- ftl/restore.sh@45 -- # create_nv_cache_bdev nvc0 0000:00:10.0 af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@35 -- # local name=nvc0 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@36 -- # local cache_bdf=0000:00:10.0 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@37 -- # local 
base_bdev=af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@38 -- # local cache_size= 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@41 -- # get_bdev_size af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1382 -- # local bdev_name=af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1383 -- # local bdev_info 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1384 -- # local bs 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1385 -- # local nb 00:29:30.632 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:30.893 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:29:30.893 { 00:29:30.893 "name": "af91b8b9-d134-4abb-8137-54bd2f4af465", 00:29:30.893 "aliases": [ 00:29:30.893 "lvs/nvme0n1p0" 00:29:30.893 ], 00:29:30.893 "product_name": "Logical Volume", 00:29:30.893 "block_size": 4096, 00:29:30.893 "num_blocks": 26476544, 00:29:30.893 "uuid": "af91b8b9-d134-4abb-8137-54bd2f4af465", 00:29:30.893 "assigned_rate_limits": { 00:29:30.893 "rw_ios_per_sec": 0, 00:29:30.893 "rw_mbytes_per_sec": 0, 00:29:30.893 "r_mbytes_per_sec": 0, 00:29:30.893 "w_mbytes_per_sec": 0 00:29:30.893 }, 00:29:30.893 "claimed": false, 00:29:30.893 "zoned": false, 00:29:30.893 "supported_io_types": { 00:29:30.893 "read": true, 00:29:30.893 "write": true, 00:29:30.893 "unmap": true, 00:29:30.893 "flush": false, 00:29:30.893 "reset": true, 00:29:30.893 "nvme_admin": false, 00:29:30.893 "nvme_io": false, 00:29:30.893 "nvme_io_md": false, 00:29:30.893 "write_zeroes": true, 00:29:30.893 "zcopy": false, 00:29:30.893 "get_zone_info": false, 00:29:30.893 "zone_management": false, 00:29:30.893 "zone_append": false, 00:29:30.893 "compare": false, 00:29:30.893 "compare_and_write": false, 00:29:30.893 "abort": false, 00:29:30.893 "seek_hole": true, 00:29:30.893 "seek_data": true, 00:29:30.893 "copy": false, 00:29:30.893 "nvme_iov_md": false 00:29:30.893 }, 00:29:30.893 "driver_specific": { 00:29:30.893 "lvol": { 00:29:30.893 "lvol_store_uuid": "05487b5d-3ef5-4c8e-9336-20d71573bb2d", 00:29:30.893 "base_bdev": "nvme0n1", 00:29:30.893 "thin_provision": true, 00:29:30.893 "num_allocated_clusters": 0, 00:29:30.893 "snapshot": false, 00:29:30.893 "clone": false, 00:29:30.893 "esnap_clone": false 00:29:30.893 } 00:29:30.893 } 00:29:30.893 } 00:29:30.893 ]' 00:29:30.893 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:29:30.893 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1387 -- # bs=4096 00:29:30.893 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:29:30.893 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1388 -- # nb=26476544 00:29:30.893 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:29:30.893 05:19:50 ftl.ftl_restore_fast -- common/autotest_common.sh@1392 -- # echo 103424 00:29:30.893 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@41 -- # local base_size=5171 00:29:30.893 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@44 -- # local nvc_bdev 00:29:30.893 05:19:50 ftl.ftl_restore_fast -- ftl/common.sh@45 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvc0 -t PCIe -a 0000:00:10.0 
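The get_bdev_size calls traced here reduce to a jq query over bdev_get_bdevs output plus bytes-to-MiB arithmetic; a minimal sketch, assuming the conversion that the bs/nb/bdev_size values in the trace imply:

    bs=$($rpc_py bdev_get_bdevs -b nvme0n1 | jq '.[] .block_size')   # 4096
    nb=$($rpc_py bdev_get_bdevs -b nvme0n1 | jq '.[] .num_blocks')   # 1310720
    echo $(( bs * nb / 1024 / 1024 ))
    # base NVMe bdev: 1310720 blocks * 4096 B  =   5120 MiB
    # thin lvol:     26476544 blocks * 4096 B  = 103424 MiB

Note the lvol was created with -t (thin provisioning), which is how a 103424 MiB volume can sit on a 5120 MiB base device; the bdev_nvme_attach_controller call just traced then brings up the NV-cache controller, captured as nvc0n1 below.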
00:29:31.154 05:19:51 ftl.ftl_restore_fast -- ftl/common.sh@45 -- # nvc_bdev=nvc0n1 00:29:31.154 05:19:51 ftl.ftl_restore_fast -- ftl/common.sh@47 -- # [[ -z '' ]] 00:29:31.154 05:19:51 ftl.ftl_restore_fast -- ftl/common.sh@48 -- # get_bdev_size af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:31.154 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1382 -- # local bdev_name=af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:31.154 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1383 -- # local bdev_info 00:29:31.154 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1384 -- # local bs 00:29:31.154 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1385 -- # local nb 00:29:31.154 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:31.415 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:29:31.415 { 00:29:31.415 "name": "af91b8b9-d134-4abb-8137-54bd2f4af465", 00:29:31.415 "aliases": [ 00:29:31.415 "lvs/nvme0n1p0" 00:29:31.415 ], 00:29:31.415 "product_name": "Logical Volume", 00:29:31.415 "block_size": 4096, 00:29:31.415 "num_blocks": 26476544, 00:29:31.415 "uuid": "af91b8b9-d134-4abb-8137-54bd2f4af465", 00:29:31.415 "assigned_rate_limits": { 00:29:31.415 "rw_ios_per_sec": 0, 00:29:31.415 "rw_mbytes_per_sec": 0, 00:29:31.415 "r_mbytes_per_sec": 0, 00:29:31.415 "w_mbytes_per_sec": 0 00:29:31.415 }, 00:29:31.415 "claimed": false, 00:29:31.415 "zoned": false, 00:29:31.415 "supported_io_types": { 00:29:31.415 "read": true, 00:29:31.415 "write": true, 00:29:31.415 "unmap": true, 00:29:31.415 "flush": false, 00:29:31.415 "reset": true, 00:29:31.415 "nvme_admin": false, 00:29:31.415 "nvme_io": false, 00:29:31.415 "nvme_io_md": false, 00:29:31.415 "write_zeroes": true, 00:29:31.415 "zcopy": false, 00:29:31.415 "get_zone_info": false, 00:29:31.415 "zone_management": false, 00:29:31.415 "zone_append": false, 00:29:31.415 "compare": false, 00:29:31.415 "compare_and_write": false, 00:29:31.415 "abort": false, 00:29:31.415 "seek_hole": true, 00:29:31.415 "seek_data": true, 00:29:31.415 "copy": false, 00:29:31.415 "nvme_iov_md": false 00:29:31.415 }, 00:29:31.415 "driver_specific": { 00:29:31.415 "lvol": { 00:29:31.415 "lvol_store_uuid": "05487b5d-3ef5-4c8e-9336-20d71573bb2d", 00:29:31.415 "base_bdev": "nvme0n1", 00:29:31.415 "thin_provision": true, 00:29:31.415 "num_allocated_clusters": 0, 00:29:31.415 "snapshot": false, 00:29:31.415 "clone": false, 00:29:31.415 "esnap_clone": false 00:29:31.415 } 00:29:31.415 } 00:29:31.415 } 00:29:31.415 ]' 00:29:31.415 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:29:31.415 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1387 -- # bs=4096 00:29:31.415 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:29:31.415 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1388 -- # nb=26476544 00:29:31.415 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:29:31.415 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1392 -- # echo 103424 00:29:31.415 05:19:51 ftl.ftl_restore_fast -- ftl/common.sh@48 -- # cache_size=5171 00:29:31.415 05:19:51 ftl.ftl_restore_fast -- ftl/common.sh@50 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_split_create nvc0n1 -s 5171 1 00:29:31.676 05:19:51 ftl.ftl_restore_fast -- ftl/restore.sh@45 -- # nvc_bdev=nvc0n1p0 
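Pulling the cache-side steps together: 5171 MiB appears to be roughly 5% of the 103424 MiB base volume, the split just traced carves exactly that slice out of nvc0n1, and the bdev_ftl_create that follows stitches base and cache into one FTL bdev. The two RPCs, copied from the trace:

    # carve one 5171 MiB slice out of the cache namespace -> nvc0n1p0
    $rpc_py bdev_split_create nvc0n1 -s 5171 1
    # FTL bdev: base = the thin lvol, cache = the slice, L2P DRAM capped at 10 MiB
    $rpc_py -t 240 bdev_ftl_create -b ftl0 -d af91b8b9-d134-4abb-8137-54bd2f4af465 --l2p_dram_limit 10 -c nvc0n1p0 --fast-shutdown

The --l2p_dram_limit 10 cap is what the startup trace later reports as "l2p maximum resident size is: 9 (of 10) MiB": the full L2P table (20971520 entries x 4 B = 80 MiB in the layout dump) cannot be held resident, so only part of it is kept in DRAM.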
00:29:31.676 05:19:51 ftl.ftl_restore_fast -- ftl/restore.sh@48 -- # get_bdev_size af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:31.676 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1382 -- # local bdev_name=af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:31.676 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1383 -- # local bdev_info 00:29:31.676 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1384 -- # local bs 00:29:31.676 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1385 -- # local nb 00:29:31.676 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1386 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_get_bdevs -b af91b8b9-d134-4abb-8137-54bd2f4af465 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1386 -- # bdev_info='[ 00:29:31.935 { 00:29:31.935 "name": "af91b8b9-d134-4abb-8137-54bd2f4af465", 00:29:31.935 "aliases": [ 00:29:31.935 "lvs/nvme0n1p0" 00:29:31.935 ], 00:29:31.935 "product_name": "Logical Volume", 00:29:31.935 "block_size": 4096, 00:29:31.935 "num_blocks": 26476544, 00:29:31.935 "uuid": "af91b8b9-d134-4abb-8137-54bd2f4af465", 00:29:31.935 "assigned_rate_limits": { 00:29:31.935 "rw_ios_per_sec": 0, 00:29:31.935 "rw_mbytes_per_sec": 0, 00:29:31.935 "r_mbytes_per_sec": 0, 00:29:31.935 "w_mbytes_per_sec": 0 00:29:31.935 }, 00:29:31.935 "claimed": false, 00:29:31.935 "zoned": false, 00:29:31.935 "supported_io_types": { 00:29:31.935 "read": true, 00:29:31.935 "write": true, 00:29:31.935 "unmap": true, 00:29:31.935 "flush": false, 00:29:31.935 "reset": true, 00:29:31.935 "nvme_admin": false, 00:29:31.935 "nvme_io": false, 00:29:31.935 "nvme_io_md": false, 00:29:31.935 "write_zeroes": true, 00:29:31.935 "zcopy": false, 00:29:31.935 "get_zone_info": false, 00:29:31.935 "zone_management": false, 00:29:31.935 "zone_append": false, 00:29:31.935 "compare": false, 00:29:31.935 "compare_and_write": false, 00:29:31.935 "abort": false, 00:29:31.935 "seek_hole": true, 00:29:31.935 "seek_data": true, 00:29:31.935 "copy": false, 00:29:31.935 "nvme_iov_md": false 00:29:31.935 }, 00:29:31.935 "driver_specific": { 00:29:31.935 "lvol": { 00:29:31.935 "lvol_store_uuid": "05487b5d-3ef5-4c8e-9336-20d71573bb2d", 00:29:31.935 "base_bdev": "nvme0n1", 00:29:31.935 "thin_provision": true, 00:29:31.935 "num_allocated_clusters": 0, 00:29:31.935 "snapshot": false, 00:29:31.935 "clone": false, 00:29:31.935 "esnap_clone": false 00:29:31.935 } 00:29:31.935 } 00:29:31.935 } 00:29:31.935 ]' 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1387 -- # jq '.[] .block_size' 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1387 -- # bs=4096 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1388 -- # jq '.[] .num_blocks' 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1388 -- # nb=26476544 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1391 -- # bdev_size=103424 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- common/autotest_common.sh@1392 -- # echo 103424 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- ftl/restore.sh@48 -- # l2p_dram_size_mb=10 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- ftl/restore.sh@49 -- # ftl_construct_args='bdev_ftl_create -b ftl0 -d af91b8b9-d134-4abb-8137-54bd2f4af465 --l2p_dram_limit 10' 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- ftl/restore.sh@51 -- # '[' -n '' ']' 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- ftl/restore.sh@52 -- # '[' -n 0000:00:10.0 ']' 00:29:31.935 05:19:51 ftl.ftl_restore_fast 
-- ftl/restore.sh@52 -- # ftl_construct_args+=' -c nvc0n1p0' 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- ftl/restore.sh@54 -- # '[' 1 -eq 1 ']' 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- ftl/restore.sh@55 -- # ftl_construct_args+=' --fast-shutdown' 00:29:31.935 05:19:51 ftl.ftl_restore_fast -- ftl/restore.sh@58 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py -t 240 bdev_ftl_create -b ftl0 -d af91b8b9-d134-4abb-8137-54bd2f4af465 --l2p_dram_limit 10 -c nvc0n1p0 --fast-shutdown 00:29:32.195 [2024-12-15 05:19:52.095221] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 05:19:52.095259] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:29:32.195 [2024-12-15 05:19:52.095269] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:29:32.195 [2024-12-15 05:19:52.095277] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.095315] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 05:19:52.095325] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:29:32.195 [2024-12-15 05:19:52.095332] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.026 ms 00:29:32.195 [2024-12-15 05:19:52.095341] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.095361] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:29:32.195 [2024-12-15 05:19:52.095678] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:29:32.195 [2024-12-15 05:19:52.095723] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 05:19:52.095741] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:29:32.195 [2024-12-15 05:19:52.095757] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.369 ms 00:29:32.195 [2024-12-15 05:19:52.095773] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.095831] mngt/ftl_mngt_md.c: 570:ftl_mngt_superblock_init: *NOTICE*: [FTL][ftl0] Create new FTL, UUID 54c73827-c7ab-4552-9a64-af9833daf145 00:29:32.195 [2024-12-15 05:19:52.096846] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 05:19:52.096935] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Default-initialize superblock 00:29:32.195 [2024-12-15 05:19:52.096950] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.020 ms 00:29:32.195 [2024-12-15 05:19:52.096956] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.101718] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 05:19:52.101745] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:29:32.195 [2024-12-15 05:19:52.101754] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.728 ms 00:29:32.195 [2024-12-15 05:19:52.101760] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.101820] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 05:19:52.101827] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:29:32.195 [2024-12-15 05:19:52.101835] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.040 ms 
00:29:32.195 [2024-12-15 05:19:52.101840] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.101875] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 05:19:52.101883] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:29:32.195 [2024-12-15 05:19:52.101891] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.008 ms 00:29:32.195 [2024-12-15 05:19:52.101896] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.101914] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:29:32.195 [2024-12-15 05:19:52.103248] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 05:19:52.103269] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:29:32.195 [2024-12-15 05:19:52.103276] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.339 ms 00:29:32.195 [2024-12-15 05:19:52.103284] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.103309] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 05:19:52.103317] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:29:32.195 [2024-12-15 05:19:52.103324] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.008 ms 00:29:32.195 [2024-12-15 05:19:52.103333] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.103354] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 1 00:29:32.195 [2024-12-15 05:19:52.103488] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:29:32.195 [2024-12-15 05:19:52.103503] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:29:32.195 [2024-12-15 05:19:52.103513] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:29:32.195 [2024-12-15 05:19:52.103521] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:29:32.195 [2024-12-15 05:19:52.103532] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:29:32.195 [2024-12-15 05:19:52.103539] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:29:32.195 [2024-12-15 05:19:52.103548] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:29:32.195 [2024-12-15 05:19:52.103553] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:29:32.195 [2024-12-15 05:19:52.103560] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:29:32.195 [2024-12-15 05:19:52.103566] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 05:19:52.103573] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:29:32.195 [2024-12-15 05:19:52.103580] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.214 ms 00:29:32.195 [2024-12-15 05:19:52.103586] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.103652] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.195 [2024-12-15 
05:19:52.103662] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:29:32.195 [2024-12-15 05:19:52.103667] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.053 ms 00:29:32.195 [2024-12-15 05:19:52.103676] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.195 [2024-12-15 05:19:52.103748] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:29:32.195 [2024-12-15 05:19:52.103757] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:29:32.195 [2024-12-15 05:19:52.103763] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:29:32.195 [2024-12-15 05:19:52.103771] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:32.195 [2024-12-15 05:19:52.103777] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:29:32.195 [2024-12-15 05:19:52.103783] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:29:32.195 [2024-12-15 05:19:52.103789] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:29:32.195 [2024-12-15 05:19:52.103795] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:29:32.195 [2024-12-15 05:19:52.103801] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:29:32.195 [2024-12-15 05:19:52.103808] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:29:32.195 [2024-12-15 05:19:52.103813] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:29:32.195 [2024-12-15 05:19:52.103820] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:29:32.195 [2024-12-15 05:19:52.103825] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:29:32.195 [2024-12-15 05:19:52.103834] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:29:32.195 [2024-12-15 05:19:52.103840] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:29:32.195 [2024-12-15 05:19:52.103846] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:32.195 [2024-12-15 05:19:52.103851] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:29:32.195 [2024-12-15 05:19:52.103857] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:29:32.195 [2024-12-15 05:19:52.103862] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:32.195 [2024-12-15 05:19:52.103870] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:29:32.195 [2024-12-15 05:19:52.103875] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:29:32.195 [2024-12-15 05:19:52.103882] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:29:32.195 [2024-12-15 05:19:52.103887] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:29:32.195 [2024-12-15 05:19:52.103893] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:29:32.195 [2024-12-15 05:19:52.103899] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:29:32.195 [2024-12-15 05:19:52.103907] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:29:32.195 [2024-12-15 05:19:52.103912] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:29:32.195 [2024-12-15 05:19:52.103919] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:29:32.195 [2024-12-15 05:19:52.103925] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 
00:29:32.195 [2024-12-15 05:19:52.103934] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:29:32.195 [2024-12-15 05:19:52.103940] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:29:32.195 [2024-12-15 05:19:52.103947] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:29:32.195 [2024-12-15 05:19:52.103952] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:29:32.195 [2024-12-15 05:19:52.103959] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:29:32.195 [2024-12-15 05:19:52.103965] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:29:32.195 [2024-12-15 05:19:52.103972] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:29:32.195 [2024-12-15 05:19:52.103978] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:29:32.195 [2024-12-15 05:19:52.103985] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:29:32.195 [2024-12-15 05:19:52.103990] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:29:32.195 [2024-12-15 05:19:52.103997] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:32.195 [2024-12-15 05:19:52.104002] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:29:32.195 [2024-12-15 05:19:52.104009] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:29:32.195 [2024-12-15 05:19:52.104015] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:32.196 [2024-12-15 05:19:52.104021] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:29:32.196 [2024-12-15 05:19:52.104028] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:29:32.196 [2024-12-15 05:19:52.104036] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:29:32.196 [2024-12-15 05:19:52.104043] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:32.196 [2024-12-15 05:19:52.104051] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:29:32.196 [2024-12-15 05:19:52.104057] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:29:32.196 [2024-12-15 05:19:52.104064] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:29:32.196 [2024-12-15 05:19:52.104070] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:29:32.196 [2024-12-15 05:19:52.104077] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:29:32.196 [2024-12-15 05:19:52.104084] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:29:32.196 [2024-12-15 05:19:52.104092] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:29:32.196 [2024-12-15 05:19:52.104106] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:29:32.196 [2024-12-15 05:19:52.104117] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:29:32.196 [2024-12-15 05:19:52.104123] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:29:32.196 [2024-12-15 05:19:52.104131] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 
00:29:32.196 [2024-12-15 05:19:52.104137] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:29:32.196 [2024-12-15 05:19:52.104145] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:29:32.196 [2024-12-15 05:19:52.104151] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:29:32.196 [2024-12-15 05:19:52.104160] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:29:32.196 [2024-12-15 05:19:52.104166] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:29:32.196 [2024-12-15 05:19:52.104173] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:29:32.196 [2024-12-15 05:19:52.104179] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:29:32.196 [2024-12-15 05:19:52.104187] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:29:32.196 [2024-12-15 05:19:52.104193] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:29:32.196 [2024-12-15 05:19:52.104200] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:29:32.196 [2024-12-15 05:19:52.104216] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:29:32.196 [2024-12-15 05:19:52.104224] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:29:32.196 [2024-12-15 05:19:52.104231] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:29:32.196 [2024-12-15 05:19:52.104243] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:29:32.196 [2024-12-15 05:19:52.104250] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:29:32.196 [2024-12-15 05:19:52.104257] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:29:32.196 [2024-12-15 05:19:52.104263] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:29:32.196 [2024-12-15 05:19:52.104270] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:32.196 [2024-12-15 05:19:52.104277] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:29:32.196 [2024-12-15 05:19:52.104288] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.572 ms 00:29:32.196 [2024-12-15 05:19:52.104293] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:32.196 [2024-12-15 05:19:52.104327] mngt/ftl_mngt_misc.c: 165:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] NV cache data region needs 
scrubbing, this may take a while. 00:29:32.196 [2024-12-15 05:19:52.104335] mngt/ftl_mngt_misc.c: 166:ftl_mngt_scrub_nv_cache: *NOTICE*: [FTL][ftl0] Scrubbing 5 chunks 00:29:36.406 [2024-12-15 05:19:55.966967] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:55.967059] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Scrub NV cache 00:29:36.406 [2024-12-15 05:19:55.967080] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3862.616 ms 00:29:36.406 [2024-12-15 05:19:55.967090] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:55.980628] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:55.980687] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:29:36.406 [2024-12-15 05:19:55.980705] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 13.416 ms 00:29:36.406 [2024-12-15 05:19:55.980715] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:55.980846] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:55.980859] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:29:36.406 [2024-12-15 05:19:55.980871] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.072 ms 00:29:36.406 [2024-12-15 05:19:55.980879] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:55.993536] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:55.993585] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:29:36.406 [2024-12-15 05:19:55.993600] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 12.605 ms 00:29:36.406 [2024-12-15 05:19:55.993613] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:55.993649] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:55.993658] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:29:36.406 [2024-12-15 05:19:55.993669] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:29:36.406 [2024-12-15 05:19:55.993677] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:55.994212] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:55.994237] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:29:36.406 [2024-12-15 05:19:55.994252] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.478 ms 00:29:36.406 [2024-12-15 05:19:55.994262] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:55.994391] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:55.994414] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:29:36.406 [2024-12-15 05:19:55.994427] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.097 ms 00:29:36.406 [2024-12-15 05:19:55.994485] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:56.003211] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.003256] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 
00:29:36.406 [2024-12-15 05:19:56.003269] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.699 ms 00:29:36.406 [2024-12-15 05:19:56.003279] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:56.026026] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:29:36.406 [2024-12-15 05:19:56.030514] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.030574] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:29:36.406 [2024-12-15 05:19:56.030594] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 27.151 ms 00:29:36.406 [2024-12-15 05:19:56.030620] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:56.131150] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.131214] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear L2P 00:29:36.406 [2024-12-15 05:19:56.131231] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 100.475 ms 00:29:36.406 [2024-12-15 05:19:56.131245] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:56.131476] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.131494] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:29:36.406 [2024-12-15 05:19:56.131504] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.183 ms 00:29:36.406 [2024-12-15 05:19:56.131515] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:56.137309] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.137366] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial band info metadata 00:29:36.406 [2024-12-15 05:19:56.137382] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.742 ms 00:29:36.406 [2024-12-15 05:19:56.137394] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:56.142272] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.142326] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Save initial chunk info metadata 00:29:36.406 [2024-12-15 05:19:56.142338] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.818 ms 00:29:36.406 [2024-12-15 05:19:56.142349] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:56.142728] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.142746] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:29:36.406 [2024-12-15 05:19:56.142756] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.333 ms 00:29:36.406 [2024-12-15 05:19:56.142768] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:56.195007] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.195306] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Wipe P2L region 00:29:36.406 [2024-12-15 05:19:56.195336] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 52.217 ms 00:29:36.406 [2024-12-15 05:19:56.195347] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 
[2024-12-15 05:19:56.202499] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.202552] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim map 00:29:36.406 [2024-12-15 05:19:56.202565] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.027 ms 00:29:36.406 [2024-12-15 05:19:56.202577] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:56.208071] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.208124] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Clear trim log 00:29:36.406 [2024-12-15 05:19:56.208136] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.446 ms 00:29:36.406 [2024-12-15 05:19:56.208146] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.406 [2024-12-15 05:19:56.214287] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.406 [2024-12-15 05:19:56.214500] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:29:36.407 [2024-12-15 05:19:56.214520] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.096 ms 00:29:36.407 [2024-12-15 05:19:56.214533] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.407 [2024-12-15 05:19:56.214693] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.407 [2024-12-15 05:19:56.214735] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:29:36.407 [2024-12-15 05:19:56.214746] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.010 ms 00:29:36.407 [2024-12-15 05:19:56.214757] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.407 [2024-12-15 05:19:56.214850] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.407 [2024-12-15 05:19:56.214864] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:29:36.407 [2024-12-15 05:19:56.214873] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.038 ms 00:29:36.407 [2024-12-15 05:19:56.214887] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.407 [2024-12-15 05:19:56.216024] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 4120.327 ms, result 0 00:29:36.407 { 00:29:36.407 "name": "ftl0", 00:29:36.407 "uuid": "54c73827-c7ab-4552-9a64-af9833daf145" 00:29:36.407 } 00:29:36.407 05:19:56 ftl.ftl_restore_fast -- ftl/restore.sh@61 -- # echo '{"subsystems": [' 00:29:36.407 05:19:56 ftl.ftl_restore_fast -- ftl/restore.sh@62 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py save_subsystem_config -n bdev 00:29:36.407 05:19:56 ftl.ftl_restore_fast -- ftl/restore.sh@63 -- # echo ']}' 00:29:36.407 05:19:56 ftl.ftl_restore_fast -- ftl/restore.sh@65 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_ftl_unload -b ftl0 00:29:36.670 [2024-12-15 05:19:56.663348] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.663399] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:29:36.670 [2024-12-15 05:19:56.663418] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:29:36.670 [2024-12-15 05:19:56.663428] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.663473] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO 
channel destroy on app_thread 00:29:36.670 [2024-12-15 05:19:56.664172] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.664232] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:29:36.670 [2024-12-15 05:19:56.664244] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.683 ms 00:29:36.670 [2024-12-15 05:19:56.664262] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.664537] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.664553] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:29:36.670 [2024-12-15 05:19:56.664567] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.250 ms 00:29:36.670 [2024-12-15 05:19:56.664579] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.667818] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.667847] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist L2P 00:29:36.670 [2024-12-15 05:19:56.667858] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.223 ms 00:29:36.670 [2024-12-15 05:19:56.667868] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.674016] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.674240] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finish L2P trims 00:29:36.670 [2024-12-15 05:19:56.674260] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.129 ms 00:29:36.670 [2024-12-15 05:19:56.674274] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.677091] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.677151] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist NV cache metadata 00:29:36.670 [2024-12-15 05:19:56.677162] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.734 ms 00:29:36.670 [2024-12-15 05:19:56.677172] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.683752] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.683931] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist valid map metadata 00:29:36.670 [2024-12-15 05:19:56.684003] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.535 ms 00:29:36.670 [2024-12-15 05:19:56.684031] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.684220] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.684499] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist P2L metadata 00:29:36.670 [2024-12-15 05:19:56.684517] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.104 ms 00:29:36.670 [2024-12-15 05:19:56.684528] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.687268] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.687431] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist band info metadata 00:29:36.670 [2024-12-15 05:19:56.687513] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.712 ms 00:29:36.670 
[2024-12-15 05:19:56.687540] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.689535] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.689685] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist trim metadata 00:29:36.670 [2024-12-15 05:19:56.689741] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.944 ms 00:29:36.670 [2024-12-15 05:19:56.689767] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.691424] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.691586] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Persist superblock 00:29:36.670 [2024-12-15 05:19:56.691639] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.608 ms 00:29:36.670 [2024-12-15 05:19:56.691667] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.693299] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.670 [2024-12-15 05:19:56.693476] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL clean state 00:29:36.670 [2024-12-15 05:19:56.693538] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.559 ms 00:29:36.670 [2024-12-15 05:19:56.693551] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.670 [2024-12-15 05:19:56.693585] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:29:36.670 [2024-12-15 05:19:56.693603] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693613] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693624] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693633] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693648] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693656] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693667] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693676] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693687] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693695] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693705] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693712] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693721] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693729] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 
00:29:36.670 [2024-12-15 05:19:56.693739] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693747] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693756] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693763] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693773] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693780] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693797] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693806] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693816] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693823] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693833] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693840] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693850] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693857] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693869] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693877] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693888] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693896] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693905] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693913] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693922] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693930] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693943] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693951] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693959] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 
wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693968] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693977] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693985] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.693994] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.694002] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.694011] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.694020] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.694031] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.694038] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:29:36.670 [2024-12-15 05:19:56.694047] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694054] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694063] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694071] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694083] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694092] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694100] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694108] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694117] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694125] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694143] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694150] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694159] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694168] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694180] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694188] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] 
Band 64: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694197] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694205] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694214] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694223] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694235] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694242] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694253] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694260] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694272] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694279] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694288] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694295] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694304] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694315] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694325] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694332] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694341] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694348] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694357] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694366] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694378] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694385] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694394] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694401] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694411] ftl_debug.c: 167:ftl_dev_dump_bands: 
*NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694419] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694429] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694465] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694475] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694484] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694494] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694504] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694514] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694522] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694533] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694541] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:29:36.671 [2024-12-15 05:19:56.694563] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:29:36.671 [2024-12-15 05:19:56.694572] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 54c73827-c7ab-4552-9a64-af9833daf145 00:29:36.671 [2024-12-15 05:19:56.694583] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:29:36.671 [2024-12-15 05:19:56.694592] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 960 00:29:36.671 [2024-12-15 05:19:56.694602] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:29:36.671 [2024-12-15 05:19:56.694610] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:29:36.671 [2024-12-15 05:19:56.694620] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:29:36.671 [2024-12-15 05:19:56.694631] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:29:36.671 [2024-12-15 05:19:56.694642] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:29:36.671 [2024-12-15 05:19:56.694648] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:29:36.671 [2024-12-15 05:19:56.694657] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:29:36.671 [2024-12-15 05:19:56.694664] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.671 [2024-12-15 05:19:56.694674] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:29:36.671 [2024-12-15 05:19:56.694683] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.080 ms 00:29:36.671 [2024-12-15 05:19:56.694693] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.671 [2024-12-15 05:19:56.696834] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.671 [2024-12-15 05:19:56.696999] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: 
Deinitialize L2P 00:29:36.671 [2024-12-15 05:19:56.697016] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.120 ms 00:29:36.671 [2024-12-15 05:19:56.697030] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.671 [2024-12-15 05:19:56.697181] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:36.671 [2024-12-15 05:19:56.697196] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:29:36.671 [2024-12-15 05:19:56.697207] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.096 ms 00:29:36.671 [2024-12-15 05:19:56.697221] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.671 [2024-12-15 05:19:56.704990] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.671 [2024-12-15 05:19:56.705045] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:29:36.671 [2024-12-15 05:19:56.705059] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.671 [2024-12-15 05:19:56.705071] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.671 [2024-12-15 05:19:56.705129] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.671 [2024-12-15 05:19:56.705140] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:29:36.671 [2024-12-15 05:19:56.705148] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.671 [2024-12-15 05:19:56.705158] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.671 [2024-12-15 05:19:56.705232] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.671 [2024-12-15 05:19:56.705251] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:29:36.671 [2024-12-15 05:19:56.705259] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.671 [2024-12-15 05:19:56.705272] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.671 [2024-12-15 05:19:56.705289] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.671 [2024-12-15 05:19:56.705305] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:29:36.671 [2024-12-15 05:19:56.705313] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.671 [2024-12-15 05:19:56.705323] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.671 [2024-12-15 05:19:56.719355] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.671 [2024-12-15 05:19:56.719417] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:29:36.671 [2024-12-15 05:19:56.719464] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.671 [2024-12-15 05:19:56.719476] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.671 [2024-12-15 05:19:56.730917] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.671 [2024-12-15 05:19:56.730978] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:29:36.671 [2024-12-15 05:19:56.730989] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.671 [2024-12-15 05:19:56.731001] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.671 [2024-12-15 05:19:56.731077] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.671 [2024-12-15 
05:19:56.731094] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:29:36.672 [2024-12-15 05:19:56.731102] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.672 [2024-12-15 05:19:56.731113] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.672 [2024-12-15 05:19:56.731165] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.672 [2024-12-15 05:19:56.731178] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:29:36.672 [2024-12-15 05:19:56.731189] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.672 [2024-12-15 05:19:56.731199] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.672 [2024-12-15 05:19:56.731273] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.672 [2024-12-15 05:19:56.731287] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:29:36.672 [2024-12-15 05:19:56.731296] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.672 [2024-12-15 05:19:56.731307] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.672 [2024-12-15 05:19:56.731341] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.672 [2024-12-15 05:19:56.731360] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:29:36.672 [2024-12-15 05:19:56.731369] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.672 [2024-12-15 05:19:56.731380] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.672 [2024-12-15 05:19:56.731426] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.672 [2024-12-15 05:19:56.731475] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:29:36.672 [2024-12-15 05:19:56.731485] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.672 [2024-12-15 05:19:56.731497] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.672 [2024-12-15 05:19:56.731552] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:29:36.672 [2024-12-15 05:19:56.731567] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:29:36.672 [2024-12-15 05:19:56.731577] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:29:36.672 [2024-12-15 05:19:56.731588] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:36.672 [2024-12-15 05:19:56.731741] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL shutdown', duration = 68.341 ms, result 0 00:29:36.672 true 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- ftl/restore.sh@66 -- # killprocess 96602 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- common/autotest_common.sh@954 -- # '[' -z 96602 ']' 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- common/autotest_common.sh@958 -- # kill -0 96602 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- common/autotest_common.sh@959 -- # uname 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']' 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 96602 00:29:36.672 killing process with pid 96602 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- common/autotest_common.sh@960 -- # process_name=reactor_0 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- 
common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']' 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- common/autotest_common.sh@972 -- # echo 'killing process with pid 96602' 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- common/autotest_common.sh@973 -- # kill 96602 00:29:36.672 05:19:56 ftl.ftl_restore_fast -- common/autotest_common.sh@978 -- # wait 96602 00:29:41.967 05:20:01 ftl.ftl_restore_fast -- ftl/restore.sh@69 -- # dd if=/dev/urandom of=/home/vagrant/spdk_repo/spdk/test/ftl/testfile bs=4K count=256K 00:29:45.268 262144+0 records in 00:29:45.268 262144+0 records out 00:29:45.268 1073741824 bytes (1.1 GB, 1.0 GiB) copied, 3.44764 s, 311 MB/s 00:29:45.268 05:20:04 ftl.ftl_restore_fast -- ftl/restore.sh@70 -- # md5sum /home/vagrant/spdk_repo/spdk/test/ftl/testfile 00:29:47.185 05:20:06 ftl.ftl_restore_fast -- ftl/restore.sh@73 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --ob=ftl0 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json 00:29:47.185 [2024-12-15 05:20:06.983141] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:29:47.185 [2024-12-15 05:20:06.983272] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid96805 ] 00:29:47.185 [2024-12-15 05:20:07.143544] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:47.185 [2024-12-15 05:20:07.168669] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:29:47.185 [2024-12-15 05:20:07.281524] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:29:47.185 [2024-12-15 05:20:07.281609] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:29:47.448 [2024-12-15 05:20:07.444008] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.444068] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:29:47.448 [2024-12-15 05:20:07.444084] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:29:47.448 [2024-12-15 05:20:07.444093] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.444151] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.444163] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:29:47.448 [2024-12-15 05:20:07.444177] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.038 ms 00:29:47.448 [2024-12-15 05:20:07.444185] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.444225] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:29:47.448 [2024-12-15 05:20:07.444543] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:29:47.448 [2024-12-15 05:20:07.444568] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.444583] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:29:47.448 [2024-12-15 05:20:07.444596] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.348 ms 00:29:47.448 [2024-12-15 05:20:07.444606] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 
0 00:29:47.448 [2024-12-15 05:20:07.446301] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 0, shm_clean 0 00:29:47.448 [2024-12-15 05:20:07.450094] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.450346] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:29:47.448 [2024-12-15 05:20:07.450377] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.794 ms 00:29:47.448 [2024-12-15 05:20:07.450390] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.450611] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.450646] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:29:47.448 [2024-12-15 05:20:07.450658] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.034 ms 00:29:47.448 [2024-12-15 05:20:07.450671] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.458929] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.458975] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:29:47.448 [2024-12-15 05:20:07.458992] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.208 ms 00:29:47.448 [2024-12-15 05:20:07.459005] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.459107] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.459117] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:29:47.448 [2024-12-15 05:20:07.459126] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.080 ms 00:29:47.448 [2024-12-15 05:20:07.459135] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.459200] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.459217] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:29:47.448 [2024-12-15 05:20:07.459226] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.008 ms 00:29:47.448 [2024-12-15 05:20:07.459239] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.459262] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:29:47.448 [2024-12-15 05:20:07.461328] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.461595] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:29:47.448 [2024-12-15 05:20:07.461614] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.071 ms 00:29:47.448 [2024-12-15 05:20:07.461623] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.461670] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.461683] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:29:47.448 [2024-12-15 05:20:07.461694] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:29:47.448 [2024-12-15 05:20:07.461706] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.461733] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 
0 00:29:47.448 [2024-12-15 05:20:07.461759] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:29:47.448 [2024-12-15 05:20:07.461798] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:29:47.448 [2024-12-15 05:20:07.461819] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:29:47.448 [2024-12-15 05:20:07.461926] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:29:47.448 [2024-12-15 05:20:07.461944] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:29:47.448 [2024-12-15 05:20:07.461959] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:29:47.448 [2024-12-15 05:20:07.461972] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:29:47.448 [2024-12-15 05:20:07.461982] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:29:47.448 [2024-12-15 05:20:07.461990] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:29:47.448 [2024-12-15 05:20:07.461999] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:29:47.448 [2024-12-15 05:20:07.462013] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:29:47.448 [2024-12-15 05:20:07.462022] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:29:47.448 [2024-12-15 05:20:07.462031] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.462039] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:29:47.448 [2024-12-15 05:20:07.462046] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.303 ms 00:29:47.448 [2024-12-15 05:20:07.462058] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.462146] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.448 [2024-12-15 05:20:07.462158] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:29:47.448 [2024-12-15 05:20:07.462171] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.070 ms 00:29:47.448 [2024-12-15 05:20:07.462180] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.448 [2024-12-15 05:20:07.462279] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:29:47.449 [2024-12-15 05:20:07.462292] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:29:47.449 [2024-12-15 05:20:07.462302] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:29:47.449 [2024-12-15 05:20:07.462318] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462327] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:29:47.449 [2024-12-15 05:20:07.462338] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462348] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:29:47.449 [2024-12-15 05:20:07.462356] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:29:47.449 [2024-12-15 05:20:07.462365] ftl_layout.c: 131:dump_region: 
*NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462373] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:29:47.449 [2024-12-15 05:20:07.462381] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:29:47.449 [2024-12-15 05:20:07.462393] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:29:47.449 [2024-12-15 05:20:07.462402] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:29:47.449 [2024-12-15 05:20:07.462411] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:29:47.449 [2024-12-15 05:20:07.462420] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:29:47.449 [2024-12-15 05:20:07.462428] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462453] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:29:47.449 [2024-12-15 05:20:07.462460] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:29:47.449 [2024-12-15 05:20:07.462467] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462474] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:29:47.449 [2024-12-15 05:20:07.462481] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462490] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:29:47.449 [2024-12-15 05:20:07.462500] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:29:47.449 [2024-12-15 05:20:07.462507] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462514] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:29:47.449 [2024-12-15 05:20:07.462523] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:29:47.449 [2024-12-15 05:20:07.462531] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462548] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:29:47.449 [2024-12-15 05:20:07.462562] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:29:47.449 [2024-12-15 05:20:07.462569] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462576] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:29:47.449 [2024-12-15 05:20:07.462582] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:29:47.449 [2024-12-15 05:20:07.462590] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462597] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:29:47.449 [2024-12-15 05:20:07.462603] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:29:47.449 [2024-12-15 05:20:07.462609] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:29:47.449 [2024-12-15 05:20:07.462616] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:29:47.449 [2024-12-15 05:20:07.462622] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:29:47.449 [2024-12-15 05:20:07.462629] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:29:47.449 [2024-12-15 05:20:07.462639] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:47.449 [2024-12-15 
05:20:07.462645] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:29:47.449 [2024-12-15 05:20:07.462652] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:29:47.449 [2024-12-15 05:20:07.462658] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462668] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:29:47.449 [2024-12-15 05:20:07.462681] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:29:47.449 [2024-12-15 05:20:07.462688] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:29:47.449 [2024-12-15 05:20:07.462696] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:29:47.449 [2024-12-15 05:20:07.462703] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:29:47.449 [2024-12-15 05:20:07.462710] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:29:47.449 [2024-12-15 05:20:07.462718] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:29:47.449 [2024-12-15 05:20:07.462726] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:29:47.449 [2024-12-15 05:20:07.462733] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:29:47.449 [2024-12-15 05:20:07.462739] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:29:47.449 [2024-12-15 05:20:07.462748] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:29:47.449 [2024-12-15 05:20:07.462759] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:29:47.449 [2024-12-15 05:20:07.462773] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:29:47.449 [2024-12-15 05:20:07.462780] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:29:47.449 [2024-12-15 05:20:07.462787] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:29:47.449 [2024-12-15 05:20:07.462794] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:29:47.449 [2024-12-15 05:20:07.462805] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:29:47.449 [2024-12-15 05:20:07.462813] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:29:47.449 [2024-12-15 05:20:07.462820] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:29:47.449 [2024-12-15 05:20:07.462826] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:29:47.449 [2024-12-15 05:20:07.462833] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:29:47.449 [2024-12-15 05:20:07.462842] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:29:47.449 [2024-12-15 05:20:07.462850] 
upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:29:47.449 [2024-12-15 05:20:07.462856] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:29:47.449 [2024-12-15 05:20:07.462863] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:29:47.449 [2024-12-15 05:20:07.462870] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:29:47.449 [2024-12-15 05:20:07.462879] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:29:47.449 [2024-12-15 05:20:07.462890] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:29:47.449 [2024-12-15 05:20:07.462902] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:29:47.449 [2024-12-15 05:20:07.462910] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:29:47.449 [2024-12-15 05:20:07.462918] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:29:47.449 [2024-12-15 05:20:07.462926] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:29:47.449 [2024-12-15 05:20:07.462938] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.449 [2024-12-15 05:20:07.462947] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:29:47.449 [2024-12-15 05:20:07.462955] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.725 ms 00:29:47.449 [2024-12-15 05:20:07.462965] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.449 [2024-12-15 05:20:07.477148] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.449 [2024-12-15 05:20:07.477364] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:29:47.449 [2024-12-15 05:20:07.477383] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 14.134 ms 00:29:47.449 [2024-12-15 05:20:07.477392] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.449 [2024-12-15 05:20:07.477503] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.449 [2024-12-15 05:20:07.477519] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:29:47.449 [2024-12-15 05:20:07.477530] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.083 ms 00:29:47.449 [2024-12-15 05:20:07.477539] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.449 [2024-12-15 05:20:07.497627] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.449 [2024-12-15 05:20:07.497691] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:29:47.449 [2024-12-15 05:20:07.497709] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 20.025 ms 00:29:47.450 [2024-12-15 05:20:07.497720] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.497781] 
mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.497796] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:29:47.450 [2024-12-15 05:20:07.497808] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:29:47.450 [2024-12-15 05:20:07.497820] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.498390] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.498429] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:29:47.450 [2024-12-15 05:20:07.498470] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.486 ms 00:29:47.450 [2024-12-15 05:20:07.498483] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.498687] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.498778] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:29:47.450 [2024-12-15 05:20:07.498802] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.167 ms 00:29:47.450 [2024-12-15 05:20:07.498818] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.507245] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.507301] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:29:47.450 [2024-12-15 05:20:07.507316] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.390 ms 00:29:47.450 [2024-12-15 05:20:07.507327] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.511410] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 0, empty chunks = 4 00:29:47.450 [2024-12-15 05:20:07.511488] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:29:47.450 [2024-12-15 05:20:07.511501] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.511510] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:29:47.450 [2024-12-15 05:20:07.511519] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.028 ms 00:29:47.450 [2024-12-15 05:20:07.511528] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.527743] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.527814] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:29:47.450 [2024-12-15 05:20:07.527833] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 16.087 ms 00:29:47.450 [2024-12-15 05:20:07.527842] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.531021] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.531069] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:29:47.450 [2024-12-15 05:20:07.531079] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 3.122 ms 00:29:47.450 [2024-12-15 05:20:07.531087] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.533964] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
00:29:47.450 [2024-12-15 05:20:07.534184] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:29:47.450 [2024-12-15 05:20:07.534203] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.830 ms 00:29:47.450 [2024-12-15 05:20:07.534212] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.534595] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.534613] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:29:47.450 [2024-12-15 05:20:07.534624] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.306 ms 00:29:47.450 [2024-12-15 05:20:07.534632] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.559347] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.559409] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:29:47.450 [2024-12-15 05:20:07.559421] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 24.695 ms 00:29:47.450 [2024-12-15 05:20:07.559429] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.567335] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:29:47.450 [2024-12-15 05:20:07.570531] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.570577] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:29:47.450 [2024-12-15 05:20:07.570590] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 11.013 ms 00:29:47.450 [2024-12-15 05:20:07.570602] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.570684] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.570696] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:29:47.450 [2024-12-15 05:20:07.570710] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:29:47.450 [2024-12-15 05:20:07.570719] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.570786] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.570798] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:29:47.450 [2024-12-15 05:20:07.570810] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.032 ms 00:29:47.450 [2024-12-15 05:20:07.570818] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.570839] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.570848] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:29:47.450 [2024-12-15 05:20:07.570858] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:29:47.450 [2024-12-15 05:20:07.570866] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.570907] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:29:47.450 [2024-12-15 05:20:07.570920] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.570928] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 
00:29:47.450 [2024-12-15 05:20:07.570936] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.016 ms 00:29:47.450 [2024-12-15 05:20:07.570947] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.576235] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.576278] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:29:47.450 [2024-12-15 05:20:07.576289] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.270 ms 00:29:47.450 [2024-12-15 05:20:07.576297] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.576374] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:29:47.450 [2024-12-15 05:20:07.576384] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:29:47.450 [2024-12-15 05:20:07.576393] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.036 ms 00:29:47.450 [2024-12-15 05:20:07.576408] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:29:47.450 [2024-12-15 05:20:07.577521] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 133.050 ms, result 0 00:29:48.840  [2024-12-15T05:20:09.924Z] Copying: 15/1024 [MB] (15 MBps) [2024-12-15T05:20:10.871Z] Copying: 30/1024 [MB] (14 MBps) [2024-12-15T05:20:11.817Z] Copying: 54/1024 [MB] (24 MBps) [2024-12-15T05:20:12.762Z] Copying: 68/1024 [MB] (13 MBps) [2024-12-15T05:20:13.706Z] Copying: 89/1024 [MB] (20 MBps) [2024-12-15T05:20:14.652Z] Copying: 107/1024 [MB] (18 MBps) [2024-12-15T05:20:15.595Z] Copying: 122/1024 [MB] (15 MBps) [2024-12-15T05:20:16.983Z] Copying: 138/1024 [MB] (15 MBps) [2024-12-15T05:20:17.971Z] Copying: 154/1024 [MB] (16 MBps) [2024-12-15T05:20:18.936Z] Copying: 171/1024 [MB] (16 MBps) [2024-12-15T05:20:19.880Z] Copying: 187/1024 [MB] (15 MBps) [2024-12-15T05:20:20.824Z] Copying: 201/1024 [MB] (14 MBps) [2024-12-15T05:20:21.769Z] Copying: 214/1024 [MB] (12 MBps) [2024-12-15T05:20:22.713Z] Copying: 229964/1048576 [kB] (10172 kBps) [2024-12-15T05:20:23.658Z] Copying: 234/1024 [MB] (10 MBps) [2024-12-15T05:20:24.602Z] Copying: 249/1024 [MB] (14 MBps) [2024-12-15T05:20:25.989Z] Copying: 265/1024 [MB] (15 MBps) [2024-12-15T05:20:26.933Z] Copying: 280/1024 [MB] (15 MBps) [2024-12-15T05:20:27.879Z] Copying: 294/1024 [MB] (14 MBps) [2024-12-15T05:20:28.824Z] Copying: 305/1024 [MB] (10 MBps) [2024-12-15T05:20:29.768Z] Copying: 322/1024 [MB] (16 MBps) [2024-12-15T05:20:30.711Z] Copying: 336/1024 [MB] (13 MBps) [2024-12-15T05:20:31.655Z] Copying: 350/1024 [MB] (14 MBps) [2024-12-15T05:20:32.600Z] Copying: 365/1024 [MB] (14 MBps) [2024-12-15T05:20:33.988Z] Copying: 379/1024 [MB] (14 MBps) [2024-12-15T05:20:34.933Z] Copying: 394/1024 [MB] (14 MBps) [2024-12-15T05:20:35.878Z] Copying: 405/1024 [MB] (11 MBps) [2024-12-15T05:20:36.823Z] Copying: 422/1024 [MB] (16 MBps) [2024-12-15T05:20:37.766Z] Copying: 436/1024 [MB] (13 MBps) [2024-12-15T05:20:38.712Z] Copying: 446/1024 [MB] (10 MBps) [2024-12-15T05:20:39.657Z] Copying: 465/1024 [MB] (19 MBps) [2024-12-15T05:20:40.602Z] Copying: 477/1024 [MB] (12 MBps) [2024-12-15T05:20:41.602Z] Copying: 494/1024 [MB] (17 MBps) [2024-12-15T05:20:42.989Z] Copying: 514/1024 [MB] (19 MBps) [2024-12-15T05:20:43.934Z] Copying: 543/1024 [MB] (28 MBps) [2024-12-15T05:20:44.877Z] Copying: 554/1024 [MB] (11 MBps) [2024-12-15T05:20:45.820Z] Copying: 572/1024 [MB] (17 MBps) 
[2024-12-15T05:20:46.763Z] Copying: 588/1024 [MB] (16 MBps) [2024-12-15T05:20:47.708Z] Copying: 612/1024 [MB] (23 MBps) [2024-12-15T05:20:48.651Z] Copying: 634/1024 [MB] (22 MBps) [2024-12-15T05:20:49.594Z] Copying: 659/1024 [MB] (24 MBps) [2024-12-15T05:20:50.980Z] Copying: 696/1024 [MB] (37 MBps) [2024-12-15T05:20:51.924Z] Copying: 733/1024 [MB] (36 MBps) [2024-12-15T05:20:52.868Z] Copying: 767/1024 [MB] (33 MBps) [2024-12-15T05:20:53.810Z] Copying: 778/1024 [MB] (11 MBps) [2024-12-15T05:20:54.754Z] Copying: 798/1024 [MB] (20 MBps) [2024-12-15T05:20:55.695Z] Copying: 810/1024 [MB] (11 MBps) [2024-12-15T05:20:56.783Z] Copying: 837/1024 [MB] (27 MBps) [2024-12-15T05:20:57.726Z] Copying: 847/1024 [MB] (10 MBps) [2024-12-15T05:20:58.671Z] Copying: 864/1024 [MB] (16 MBps) [2024-12-15T05:20:59.615Z] Copying: 879/1024 [MB] (14 MBps) [2024-12-15T05:21:01.001Z] Copying: 892/1024 [MB] (13 MBps) [2024-12-15T05:21:01.943Z] Copying: 903/1024 [MB] (10 MBps) [2024-12-15T05:21:02.886Z] Copying: 916/1024 [MB] (12 MBps) [2024-12-15T05:21:03.828Z] Copying: 926/1024 [MB] (10 MBps) [2024-12-15T05:21:04.773Z] Copying: 939/1024 [MB] (12 MBps) [2024-12-15T05:21:05.716Z] Copying: 951/1024 [MB] (12 MBps) [2024-12-15T05:21:06.660Z] Copying: 985/1024 [MB] (34 MBps) [2024-12-15T05:21:06.660Z] Copying: 1024/1024 [MB] (average 17 MBps)[2024-12-15 05:21:06.582502] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:46.520 [2024-12-15 05:21:06.582549] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:30:46.520 [2024-12-15 05:21:06.582562] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:30:46.520 [2024-12-15 05:21:06.582574] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.520 [2024-12-15 05:21:06.582594] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:30:46.520 [2024-12-15 05:21:06.583055] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:46.521 [2024-12-15 05:21:06.583072] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:30:46.521 [2024-12-15 05:21:06.583081] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.448 ms 00:30:46.521 [2024-12-15 05:21:06.583088] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.521 [2024-12-15 05:21:06.584934] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:46.521 [2024-12-15 05:21:06.584967] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:30:46.521 [2024-12-15 05:21:06.584976] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.827 ms 00:30:46.521 [2024-12-15 05:21:06.584984] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.521 [2024-12-15 05:21:06.585011] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:46.521 [2024-12-15 05:21:06.585019] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Fast persist NV cache metadata 00:30:46.521 [2024-12-15 05:21:06.585036] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:30:46.521 [2024-12-15 05:21:06.585043] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.521 [2024-12-15 05:21:06.585089] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:46.521 [2024-12-15 05:21:06.585097] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL SHM clean state 00:30:46.521 [2024-12-15 05:21:06.585105] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.015 ms 00:30:46.521 [2024-12-15 05:21:06.585112] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.521 [2024-12-15 05:21:06.585125] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:30:46.521 [2024-12-15 05:21:06.585138] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585147] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585155] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585162] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585170] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585177] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585184] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585191] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585198] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585205] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585219] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585226] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585233] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585240] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585247] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585255] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585262] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585269] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585276] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585283] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585292] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585299] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:30:46.521 [2024-12-15 05:21:06.585308] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 
261120 wr_cnt: 0 state: free
[ftl_dev_dump_bands, Bands 24-97: identical entries condensed; each reports "0 / 261120 wr_cnt: 0 state: free", the same as the surrounding band entries]
00:30:46.522 [2024-12-15
05:21:06.586470] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:30:46.522 [2024-12-15 05:21:06.586477] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:30:46.522 [2024-12-15 05:21:06.586484] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:30:46.522 [2024-12-15 05:21:06.586499] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:30:46.522 [2024-12-15 05:21:06.586508] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 54c73827-c7ab-4552-9a64-af9833daf145 00:30:46.522 [2024-12-15 05:21:06.586516] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:30:46.522 [2024-12-15 05:21:06.586523] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 32 00:30:46.522 [2024-12-15 05:21:06.586530] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:30:46.522 [2024-12-15 05:21:06.586543] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:30:46.522 [2024-12-15 05:21:06.586550] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:30:46.522 [2024-12-15 05:21:06.586560] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:30:46.522 [2024-12-15 05:21:06.586567] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:30:46.522 [2024-12-15 05:21:06.586574] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:30:46.522 [2024-12-15 05:21:06.586580] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:30:46.522 [2024-12-15 05:21:06.586587] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:46.522 [2024-12-15 05:21:06.586595] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:30:46.522 [2024-12-15 05:21:06.586606] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.463 ms 00:30:46.522 [2024-12-15 05:21:06.586613] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.588026] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:46.522 [2024-12-15 05:21:06.588049] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:30:46.522 [2024-12-15 05:21:06.588057] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.395 ms 00:30:46.522 [2024-12-15 05:21:06.588070] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.588161] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:46.522 [2024-12-15 05:21:06.588174] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:30:46.522 [2024-12-15 05:21:06.588182] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.061 ms 00:30:46.522 [2024-12-15 05:21:06.588189] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.593205] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.593313] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:30:46.522 [2024-12-15 05:21:06.593328] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.593336] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.593386] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: 
[FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.593402] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:30:46.522 [2024-12-15 05:21:06.593410] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.593417] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.593479] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.593489] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:30:46.522 [2024-12-15 05:21:06.593497] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.593504] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.593518] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.593526] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:30:46.522 [2024-12-15 05:21:06.593536] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.593548] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.602509] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.602546] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:30:46.522 [2024-12-15 05:21:06.602556] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.602564] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.609689] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.609725] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:30:46.522 [2024-12-15 05:21:06.609741] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.609748] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.609793] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.609802] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:30:46.522 [2024-12-15 05:21:06.609809] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.609817] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.609840] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.609848] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:30:46.522 [2024-12-15 05:21:06.609856] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.609866] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.609916] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.609925] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:30:46.522 [2024-12-15 05:21:06.609933] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.609940] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 
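The ftl_dev_dump_stats block above reports total writes: 32, user writes: 0 and WAF: inf. WAF is the write amplification factor, media writes divided by host writes; with zero user writes it degenerates to infinity, which is expected for a teardown that saw only metadata traffic. A minimal sketch of that arithmetic (illustrative only, not SPDK's ftl_debug.c implementation); the rollback trace continues below:

```python
# Minimal sketch of the WAF arithmetic behind the "WAF: inf" line above.
# Illustrative only; not SPDK's implementation.

def write_amplification(total_writes: int, user_writes: int) -> float:
    """Media writes divided by host writes; infinite when the host wrote
    nothing, as in this run (total writes: 32, user writes: 0)."""
    if user_writes == 0:
        return float("inf")
    return total_writes / user_writes

print(write_amplification(32, 0))  # inf, matching the logged "WAF: inf"
```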
00:30:46.522 [2024-12-15 05:21:06.609964] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.609972] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:30:46.522 [2024-12-15 05:21:06.609980] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.609987] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.610023] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.610032] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:30:46.522 [2024-12-15 05:21:06.610040] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.610047] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.610084] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:30:46.522 [2024-12-15 05:21:06.610093] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:30:46.522 [2024-12-15 05:21:06.610101] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:30:46.522 [2024-12-15 05:21:06.610110] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:46.522 [2024-12-15 05:21:06.610221] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL fast shutdown', duration = 27.694 ms, result 0 00:30:46.784 00:30:46.784 00:30:46.784 05:21:06 ftl.ftl_restore_fast -- ftl/restore.sh@74 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=ftl0 --of=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json --count=262144 00:30:46.784 [2024-12-15 05:21:06.874000] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
00:30:46.784 [2024-12-15 05:21:06.874151] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid97412 ] 00:30:47.045 [2024-12-15 05:21:07.038316] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:47.045 [2024-12-15 05:21:07.065936] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:30:47.045 [2024-12-15 05:21:07.177056] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:30:47.045 [2024-12-15 05:21:07.177388] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:30:47.308 [2024-12-15 05:21:07.339156] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.339364] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:30:47.308 [2024-12-15 05:21:07.339390] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms 00:30:47.308 [2024-12-15 05:21:07.339400] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.339508] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.339526] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:30:47.308 [2024-12-15 05:21:07.339535] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.080 ms 00:30:47.308 [2024-12-15 05:21:07.339543] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.339578] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:30:47.308 [2024-12-15 05:21:07.339987] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:30:47.308 [2024-12-15 05:21:07.340021] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.340035] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:30:47.308 [2024-12-15 05:21:07.340050] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.455 ms 00:30:47.308 [2024-12-15 05:21:07.340063] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.340383] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 1, shm_clean 1 00:30:47.308 [2024-12-15 05:21:07.340409] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.340419] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:30:47.308 [2024-12-15 05:21:07.340429] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.028 ms 00:30:47.308 [2024-12-15 05:21:07.340462] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.340526] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.340537] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:30:47.308 [2024-12-15 05:21:07.340546] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.041 ms 00:30:47.308 [2024-12-15 05:21:07.340554] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.340847] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
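Each management step in this startup is traced by mngt/ftl_mngt.c as a quadruple of NOTICE lines: an Action (or Rollback) marker, then name:, duration:, and status:. Below is a hedged sketch that folds those quadruples into per-step records, assuming exactly the message layout shown in this log; the quadruple that the Action entry above opens resumes right after this sketch.

```python
# Sketch: fold the mngt/ftl_mngt.c trace_step quadruples (Action|Rollback,
# "name: ...", "duration: X ms", "status: N") into per-step records.
# Assumes the exact message layout seen in this log; illustrative only.
import re

NAME_RE = re.compile(r"\[FTL\]\[\w+\] name: (.+)")
DUR_RE = re.compile(r"\[FTL\]\[\w+\] duration: ([\d.]+) ms")
STATUS_RE = re.compile(r"\[FTL\]\[\w+\] status: (-?\d+)")

def fold_trace_steps(lines):
    """Yield (name, duration_ms, status) for each completed quadruple."""
    name = dur = None
    for line in lines:
        if "trace_step" not in line:
            continue
        if m := NAME_RE.search(line):
            name = m.group(1)
        elif m := DUR_RE.search(line):
            dur = float(m.group(1))
        elif m := STATUS_RE.search(line):
            if name is not None and dur is not None:
                yield name, dur, int(m.group(1))
            name = dur = None

# Demo input copied from the "Check configuration" step logged above.
demo = [
    "[2024-12-15 05:21:07.339364] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration",
    "[2024-12-15 05:21:07.339390] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms",
    "[2024-12-15 05:21:07.339400] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0",
]
print(list(fold_trace_steps(demo)))  # [('Check configuration', 0.006, 0)]
```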
00:30:47.308 [2024-12-15 05:21:07.340865] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:30:47.308 [2024-12-15 05:21:07.340874] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.217 ms 00:30:47.308 [2024-12-15 05:21:07.340887] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.340971] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.340984] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:30:47.308 [2024-12-15 05:21:07.340993] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.068 ms 00:30:47.308 [2024-12-15 05:21:07.341001] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.341026] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.341036] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:30:47.308 [2024-12-15 05:21:07.341045] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.007 ms 00:30:47.308 [2024-12-15 05:21:07.341053] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.341074] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:30:47.308 [2024-12-15 05:21:07.343229] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.343284] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:30:47.308 [2024-12-15 05:21:07.343294] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.159 ms 00:30:47.308 [2024-12-15 05:21:07.343302] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.343343] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.343355] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:30:47.308 [2024-12-15 05:21:07.343364] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:30:47.308 [2024-12-15 05:21:07.343374] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.343459] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:30:47.308 [2024-12-15 05:21:07.343489] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:30:47.308 [2024-12-15 05:21:07.343524] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:30:47.308 [2024-12-15 05:21:07.343545] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:30:47.308 [2024-12-15 05:21:07.343650] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:30:47.308 [2024-12-15 05:21:07.343661] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:30:47.308 [2024-12-15 05:21:07.343672] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:30:47.308 [2024-12-15 05:21:07.343682] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:30:47.308 [2024-12-15 05:21:07.343697] ftl_layout.c: 
687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:30:47.308 [2024-12-15 05:21:07.343706] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:30:47.308 [2024-12-15 05:21:07.343714] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:30:47.308 [2024-12-15 05:21:07.343722] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:30:47.308 [2024-12-15 05:21:07.343729] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:30:47.308 [2024-12-15 05:21:07.343736] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.343747] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:30:47.308 [2024-12-15 05:21:07.343760] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.306 ms 00:30:47.308 [2024-12-15 05:21:07.343771] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.343858] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.308 [2024-12-15 05:21:07.343870] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:30:47.308 [2024-12-15 05:21:07.343880] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:30:47.308 [2024-12-15 05:21:07.343888] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.308 [2024-12-15 05:21:07.343986] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:30:47.308 [2024-12-15 05:21:07.343997] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:30:47.308 [2024-12-15 05:21:07.344012] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:30:47.308 [2024-12-15 05:21:07.344022] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:30:47.308 [2024-12-15 05:21:07.344032] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:30:47.308 [2024-12-15 05:21:07.344045] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:30:47.308 [2024-12-15 05:21:07.344053] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:30:47.308 [2024-12-15 05:21:07.344062] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:30:47.308 [2024-12-15 05:21:07.344070] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:30:47.308 [2024-12-15 05:21:07.344078] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:30:47.308 [2024-12-15 05:21:07.344087] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:30:47.308 [2024-12-15 05:21:07.344094] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:30:47.308 [2024-12-15 05:21:07.344102] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:30:47.308 [2024-12-15 05:21:07.344112] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:30:47.308 [2024-12-15 05:21:07.344121] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:30:47.308 [2024-12-15 05:21:07.344128] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:30:47.308 [2024-12-15 05:21:07.344136] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:30:47.308 [2024-12-15 05:21:07.344144] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:30:47.308 [2024-12-15 05:21:07.344154] ftl_layout.c: 
133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:30:47.309 [2024-12-15 05:21:07.344162] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:30:47.309 [2024-12-15 05:21:07.344170] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:30:47.309 [2024-12-15 05:21:07.344178] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:30:47.309 [2024-12-15 05:21:07.344185] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:30:47.309 [2024-12-15 05:21:07.344193] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:30:47.309 [2024-12-15 05:21:07.344229] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:30:47.309 [2024-12-15 05:21:07.344237] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:30:47.309 [2024-12-15 05:21:07.344245] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:30:47.309 [2024-12-15 05:21:07.344253] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:30:47.309 [2024-12-15 05:21:07.344260] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:30:47.309 [2024-12-15 05:21:07.344267] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:30:47.309 [2024-12-15 05:21:07.344274] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:30:47.309 [2024-12-15 05:21:07.344280] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:30:47.309 [2024-12-15 05:21:07.344287] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:30:47.309 [2024-12-15 05:21:07.344293] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:30:47.309 [2024-12-15 05:21:07.344305] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:30:47.309 [2024-12-15 05:21:07.344312] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:30:47.309 [2024-12-15 05:21:07.344319] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:30:47.309 [2024-12-15 05:21:07.344325] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:30:47.309 [2024-12-15 05:21:07.344332] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:30:47.309 [2024-12-15 05:21:07.344338] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:30:47.309 [2024-12-15 05:21:07.344345] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:30:47.309 [2024-12-15 05:21:07.344351] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:30:47.309 [2024-12-15 05:21:07.344359] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:30:47.309 [2024-12-15 05:21:07.344365] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:30:47.309 [2024-12-15 05:21:07.344374] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:30:47.309 [2024-12-15 05:21:07.344385] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:30:47.309 [2024-12-15 05:21:07.344395] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:30:47.309 [2024-12-15 05:21:07.344407] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:30:47.309 [2024-12-15 05:21:07.344414] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:30:47.309 [2024-12-15 05:21:07.344421] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:30:47.309 
[2024-12-15 05:21:07.344446] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:30:47.309 [2024-12-15 05:21:07.344454] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:30:47.309 [2024-12-15 05:21:07.344462] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:30:47.309 [2024-12-15 05:21:07.344471] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:30:47.309 [2024-12-15 05:21:07.344481] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:30:47.309 [2024-12-15 05:21:07.344490] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:30:47.309 [2024-12-15 05:21:07.344498] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:30:47.309 [2024-12-15 05:21:07.344505] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:30:47.309 [2024-12-15 05:21:07.344512] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:30:47.309 [2024-12-15 05:21:07.344520] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:30:47.309 [2024-12-15 05:21:07.344527] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:30:47.309 [2024-12-15 05:21:07.344534] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:30:47.309 [2024-12-15 05:21:07.344541] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:30:47.309 [2024-12-15 05:21:07.344548] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:30:47.309 [2024-12-15 05:21:07.344556] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:30:47.309 [2024-12-15 05:21:07.344563] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:30:47.309 [2024-12-15 05:21:07.344573] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:30:47.309 [2024-12-15 05:21:07.344580] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:30:47.309 [2024-12-15 05:21:07.344588] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:30:47.309 [2024-12-15 05:21:07.344595] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:30:47.309 [2024-12-15 05:21:07.344603] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:30:47.309 [2024-12-15 05:21:07.344613] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe 
ver:0 blk_offs:0x20 blk_sz:0x20 00:30:47.309 [2024-12-15 05:21:07.344621] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:30:47.309 [2024-12-15 05:21:07.344628] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:30:47.309 [2024-12-15 05:21:07.344635] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:30:47.309 [2024-12-15 05:21:07.344643] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.344650] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:30:47.309 [2024-12-15 05:21:07.344659] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.725 ms 00:30:47.309 [2024-12-15 05:21:07.344667] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.309 [2024-12-15 05:21:07.354844] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.355008] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:30:47.309 [2024-12-15 05:21:07.355069] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 10.133 ms 00:30:47.309 [2024-12-15 05:21:07.355104] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.309 [2024-12-15 05:21:07.355205] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.355228] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:30:47.309 [2024-12-15 05:21:07.355248] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.060 ms 00:30:47.309 [2024-12-15 05:21:07.355268] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.309 [2024-12-15 05:21:07.378061] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.378265] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:30:47.309 [2024-12-15 05:21:07.378339] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 22.717 ms 00:30:47.309 [2024-12-15 05:21:07.378370] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.309 [2024-12-15 05:21:07.378458] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.378490] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:30:47.309 [2024-12-15 05:21:07.378525] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms 00:30:47.309 [2024-12-15 05:21:07.378605] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.309 [2024-12-15 05:21:07.378761] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.378813] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:30:47.309 [2024-12-15 05:21:07.378943] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.063 ms 00:30:47.309 [2024-12-15 05:21:07.379021] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.309 [2024-12-15 05:21:07.379199] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.379235] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:30:47.309 [2024-12-15 05:21:07.379265] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.132 ms 00:30:47.309 [2024-12-15 05:21:07.379288] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.309 [2024-12-15 05:21:07.386818] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.386973] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:30:47.309 [2024-12-15 05:21:07.387040] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 7.493 ms 00:30:47.309 [2024-12-15 05:21:07.387063] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.309 [2024-12-15 05:21:07.387194] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 2, empty chunks = 2 00:30:47.309 [2024-12-15 05:21:07.387234] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:30:47.309 [2024-12-15 05:21:07.387270] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.387291] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:30:47.309 [2024-12-15 05:21:07.387311] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.100 ms 00:30:47.309 [2024-12-15 05:21:07.387376] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.309 [2024-12-15 05:21:07.399869] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.400011] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:30:47.309 [2024-12-15 05:21:07.400069] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 12.456 ms 00:30:47.309 [2024-12-15 05:21:07.400098] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.309 [2024-12-15 05:21:07.400253] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.309 [2024-12-15 05:21:07.400278] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:30:47.310 [2024-12-15 05:21:07.400298] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.113 ms 00:30:47.310 [2024-12-15 05:21:07.400323] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.310 [2024-12-15 05:21:07.400494] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.310 [2024-12-15 05:21:07.400545] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:30:47.310 [2024-12-15 05:21:07.400566] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.002 ms 00:30:47.310 [2024-12-15 05:21:07.400584] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.310 [2024-12-15 05:21:07.400908] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.310 [2024-12-15 05:21:07.400951] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:30:47.310 [2024-12-15 05:21:07.400973] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.271 ms 00:30:47.310 [2024-12-15 05:21:07.400996] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.310 [2024-12-15 05:21:07.401103] mngt/ftl_mngt_p2l.c: 169:ftl_mngt_p2l_restore_ckpt: *NOTICE*: [FTL][ftl0] SHM: skipping p2l ckpt restore 00:30:47.310 [2024-12-15 05:21:07.401140] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.310 [2024-12-15 05:21:07.401171] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: 
[FTL][ftl0] name: Restore P2L checkpoints 00:30:47.310 [2024-12-15 05:21:07.401192] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.038 ms 00:30:47.310 [2024-12-15 05:21:07.401246] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.310 [2024-12-15 05:21:07.410420] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:30:47.310 [2024-12-15 05:21:07.410687] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.310 [2024-12-15 05:21:07.410726] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:30:47.310 [2024-12-15 05:21:07.410803] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 9.371 ms 00:30:47.310 [2024-12-15 05:21:07.410830] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.310 [2024-12-15 05:21:07.413311] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.310 [2024-12-15 05:21:07.413449] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:30:47.310 [2024-12-15 05:21:07.413507] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.440 ms 00:30:47.310 [2024-12-15 05:21:07.413530] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.310 [2024-12-15 05:21:07.413648] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.310 [2024-12-15 05:21:07.413675] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:30:47.310 [2024-12-15 05:21:07.413696] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.034 ms 00:30:47.310 [2024-12-15 05:21:07.413718] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.310 [2024-12-15 05:21:07.413809] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.310 [2024-12-15 05:21:07.413834] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:30:47.310 [2024-12-15 05:21:07.413854] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:30:47.310 [2024-12-15 05:21:07.413882] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.310 [2024-12-15 05:21:07.413930] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:30:47.310 [2024-12-15 05:21:07.413957] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.310 [2024-12-15 05:21:07.413978] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:30:47.310 [2024-12-15 05:21:07.414033] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.025 ms 00:30:47.310 [2024-12-15 05:21:07.414055] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.310 [2024-12-15 05:21:07.419898] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.310 [2024-12-15 05:21:07.420055] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:30:47.310 [2024-12-15 05:21:07.420113] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 5.804 ms 00:30:47.310 [2024-12-15 05:21:07.420148] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:30:47.310 [2024-12-15 05:21:07.420578] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:30:47.310 [2024-12-15 05:21:07.420737] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:30:47.310 [2024-12-15 05:21:07.420797] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: 
[FTL][ftl0] duration: 0.039 ms
00:30:47.310 [2024-12-15 05:21:07.420850] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:30:47.310 [2024-12-15 05:21:07.422537] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 82.932 ms, result 0
00:30:48.696 [spdk_dd progress condensed: Copying 25/1024 [MB] at 2024-12-15T05:21:09.779Z through 1015/1024 [MB] at 05:22:12.190Z, instantaneous rates 10-28 MBps; final tick at 05:22:12.763Z: Copying: 1024/1024 [MB] (average 15 MBps)]
[2024-12-15 05:22:12.491872] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:31:52.623 [2024-12-15 05:22:12.491936] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel
00:31:52.623 [2024-12-15 05:22:12.491950] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms
00:31:52.623 [2024-12-15 05:22:12.491959] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:31:52.623 [2024-12-15 05:22:12.491987] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread
00:31:52.623 [2024-12-15 05:22:12.492497] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:31:52.623 [2024-12-15 05:22:12.492516] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device
00:31:52.623 [2024-12-15 05:22:12.492526] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.495 ms
00:31:52.623 [2024-12-15 05:22:12.492538] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:31:52.623 [2024-12-15 05:22:12.492760] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:31:52.623 [2024-12-15 05:22:12.492768] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller
00:31:52.623 [2024-12-15 05:22:12.492778] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.197 ms
00:31:52.623 [2024-12-15 05:22:12.492784] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:31:52.623 [2024-12-15 05:22:12.492808] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:31:52.623 [2024-12-15 05:22:12.492816] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Fast persist NV cache metadata
00:31:52.623 [2024-12-15 05:22:12.492823] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms
00:31:52.623 [2024-12-15 05:22:12.492829] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:31:52.623 [2024-12-15 05:22:12.492869] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:31:52.623 [2024-12-15 05:22:12.492876] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL SHM clean state
00:31:52.623 [2024-12-15 05:22:12.492883] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.016 ms
00:31:52.623 [2024-12-15 05:22:12.492889] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
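The copy that finished just before this teardown moved 1024 MB and reported an average of 15 MBps. The spdk_dd invocation above asked for --count=262144 blocks; a 4096-byte logical block size (an inference from the totals, not a value printed in this log) gives exactly 1024 MiB, and the elapsed wall time is consistent with the reported average. A quick cross-check, taking the spdk_dd start and the final progress tick as endpoints:

```python
# Cross-check of the spdk_dd summary above: --count=262144 blocks at an
# assumed 4096-byte logical block size (262144 * 4096 B = 1024 MiB, which
# matches the "1024/1024 [MB]" total) against "average 15 MBps".
from datetime import datetime

BLOCK_SIZE = 4096  # assumption; not printed in this log
total_mib = 262144 * BLOCK_SIZE / (1 << 20)

start = datetime.fromisoformat("2024-12-15 05:21:06.874")  # spdk_dd launch
end = datetime.fromisoformat("2024-12-15 05:22:12.763")    # final progress tick (both UTC)
elapsed = (end - start).total_seconds()

print(f"{total_mib:.0f} MiB in {elapsed:.1f} s -> {total_mib / elapsed:.1f} MBps")
# 1024 MiB in 65.9 s -> 15.5 MBps, consistent with the logged "average 15 MBps"
```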
00:31:52.623 [2024-12-15 05:22:12.492901] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity:
[ftl_dev_dump_bands, Bands 1-100: identical entries condensed; each again reports "0 / 261120 wr_cnt: 0 state: free"]
00:31:52.624 [2024-12-15 05:22:12.494925] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0]
00:31:52.624 [2024-12-15 05:22:12.494940] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 54c73827-c7ab-4552-9a64-af9833daf145
00:31:52.624 [2024-12-15 05:22:12.494963] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 0 00:31:52.624 [2024-12-15 05:22:12.494977] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 32 00:31:52.624 [2024-12-15 05:22:12.494992] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 0 00:31:52.624 [2024-12-15 05:22:12.495011] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: inf 00:31:52.624 [2024-12-15 05:22:12.495134] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:31:52.624 [2024-12-15 05:22:12.495159] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:31:52.625 [2024-12-15 05:22:12.495173] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:31:52.625 [2024-12-15 05:22:12.495187] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:31:52.625 [2024-12-15 05:22:12.495201] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:31:52.625 [2024-12-15 05:22:12.495216] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:52.625 [2024-12-15 05:22:12.495231] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:31:52.625 [2024-12-15 05:22:12.495281] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.316 ms 00:31:52.625 [2024-12-15 05:22:12.495303] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.496602] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:52.625 [2024-12-15 05:22:12.496681] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:31:52.625 [2024-12-15 05:22:12.496717] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.272 ms 00:31:52.625 [2024-12-15 05:22:12.496733] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.496810] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:52.625 [2024-12-15 05:22:12.496827] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:31:52.625 [2024-12-15 05:22:12.496846] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.053 ms 00:31:52.625 [2024-12-15 05:22:12.496860] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.501465] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.501548] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:31:52.625 [2024-12-15 05:22:12.501560] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.501566] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.501610] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.501617] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:31:52.625 [2024-12-15 05:22:12.501627] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.501633] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.501658] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.501665] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:31:52.625 [2024-12-15 05:22:12.501671] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.501677] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.501692] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.501698] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:31:52.625 [2024-12-15 05:22:12.501704] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.501712] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.510104] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.510210] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:31:52.625 [2024-12-15 05:22:12.510252] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.510269] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.518104] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.518208] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:31:52.625 [2024-12-15 05:22:12.518318] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.518336] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.518366] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.518382] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:31:52.625 [2024-12-15 05:22:12.518397] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.518453] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.518501] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.518523] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:31:52.625 [2024-12-15 05:22:12.518572] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.518595] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.518649] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.518670] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:31:52.625 [2024-12-15 05:22:12.518711] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.518728] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.518757] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.518774] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:31:52.625 [2024-12-15 05:22:12.518788] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.518802] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.518844] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.518865] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: 
[FTL][ftl0] name: Open cache bdev 00:31:52.625 [2024-12-15 05:22:12.518880] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.518894] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.518937] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:31:52.625 [2024-12-15 05:22:12.519121] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:31:52.625 [2024-12-15 05:22:12.519143] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:31:52.625 [2024-12-15 05:22:12.519158] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:52.625 [2024-12-15 05:22:12.519269] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL fast shutdown', duration = 27.385 ms, result 0 00:31:52.625 00:31:52.625 00:31:52.625 05:22:12 ftl.ftl_restore_fast -- ftl/restore.sh@76 -- # md5sum -c /home/vagrant/spdk_repo/spdk/test/ftl/testfile.md5 00:31:55.171 /home/vagrant/spdk_repo/spdk/test/ftl/testfile: OK 00:31:55.171 05:22:14 ftl.ftl_restore_fast -- ftl/restore.sh@79 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --if=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --ob=ftl0 --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json --seek=131072 00:31:55.171 [2024-12-15 05:22:14.998152] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 00:31:55.171 [2024-12-15 05:22:14.998255] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid98106 ] 00:31:55.171 [2024-12-15 05:22:15.146067] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:55.171 [2024-12-15 05:22:15.164793] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:31:55.171 [2024-12-15 05:22:15.247041] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:31:55.171 [2024-12-15 05:22:15.247098] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:31:55.433 [2024-12-15 05:22:15.397469] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.433 [2024-12-15 05:22:15.397510] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:31:55.433 [2024-12-15 05:22:15.397522] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:31:55.433 [2024-12-15 05:22:15.397530] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.433 [2024-12-15 05:22:15.397580] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.433 [2024-12-15 05:22:15.397590] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:31:55.433 [2024-12-15 05:22:15.397601] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.034 ms 00:31:55.433 [2024-12-15 05:22:15.397608] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.433 [2024-12-15 05:22:15.397627] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:31:55.433 [2024-12-15 05:22:15.397865] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:31:55.433 [2024-12-15 05:22:15.397880] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
00:31:55.433 [2024-12-15 05:22:15.397889] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:31:55.433 [2024-12-15 05:22:15.397900] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.258 ms 00:31:55.433 [2024-12-15 05:22:15.397908] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.433 [2024-12-15 05:22:15.398133] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 1, shm_clean 1 00:31:55.433 [2024-12-15 05:22:15.398152] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.433 [2024-12-15 05:22:15.398166] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:31:55.433 [2024-12-15 05:22:15.398174] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.021 ms 00:31:55.433 [2024-12-15 05:22:15.398187] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.433 [2024-12-15 05:22:15.398238] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.433 [2024-12-15 05:22:15.398246] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:31:55.433 [2024-12-15 05:22:15.398254] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.029 ms 00:31:55.433 [2024-12-15 05:22:15.398264] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.433 [2024-12-15 05:22:15.398513] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.433 [2024-12-15 05:22:15.398524] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:31:55.433 [2024-12-15 05:22:15.398532] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.207 ms 00:31:55.433 [2024-12-15 05:22:15.398556] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.433 [2024-12-15 05:22:15.398664] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.433 [2024-12-15 05:22:15.398674] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:31:55.433 [2024-12-15 05:22:15.398682] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.060 ms 00:31:55.433 [2024-12-15 05:22:15.398689] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.433 [2024-12-15 05:22:15.398709] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.433 [2024-12-15 05:22:15.398717] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:31:55.433 [2024-12-15 05:22:15.398724] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms 00:31:55.433 [2024-12-15 05:22:15.398731] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.433 [2024-12-15 05:22:15.398749] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:31:55.433 [2024-12-15 05:22:15.400182] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.433 [2024-12-15 05:22:15.400234] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:31:55.433 [2024-12-15 05:22:15.400243] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 1.436 ms 00:31:55.433 [2024-12-15 05:22:15.400255] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.433 [2024-12-15 05:22:15.400284] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.433 [2024-12-15 05:22:15.400292] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 
00:31:55.433 [2024-12-15 05:22:15.400299] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.010 ms 00:31:55.433 [2024-12-15 05:22:15.400309] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.433 [2024-12-15 05:22:15.400335] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:31:55.434 [2024-12-15 05:22:15.400361] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:31:55.434 [2024-12-15 05:22:15.400395] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:31:55.434 [2024-12-15 05:22:15.400412] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:31:55.434 [2024-12-15 05:22:15.400532] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:31:55.434 [2024-12-15 05:22:15.400543] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:31:55.434 [2024-12-15 05:22:15.400557] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:31:55.434 [2024-12-15 05:22:15.400570] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:31:55.434 [2024-12-15 05:22:15.400583] ftl_layout.c: 687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:31:55.434 [2024-12-15 05:22:15.400590] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:31:55.434 [2024-12-15 05:22:15.400598] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:31:55.434 [2024-12-15 05:22:15.400606] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:31:55.434 [2024-12-15 05:22:15.400613] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:31:55.434 [2024-12-15 05:22:15.400620] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.434 [2024-12-15 05:22:15.400627] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:31:55.434 [2024-12-15 05:22:15.400638] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.286 ms 00:31:55.434 [2024-12-15 05:22:15.400644] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.434 [2024-12-15 05:22:15.400728] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.434 [2024-12-15 05:22:15.400740] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:31:55.434 [2024-12-15 05:22:15.400747] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.071 ms 00:31:55.434 [2024-12-15 05:22:15.400754] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.434 [2024-12-15 05:22:15.400849] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:31:55.434 [2024-12-15 05:22:15.400859] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:31:55.434 [2024-12-15 05:22:15.400868] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:31:55.434 [2024-12-15 05:22:15.400885] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:31:55.434 [2024-12-15 05:22:15.400893] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:31:55.434 [2024-12-15 05:22:15.400905] ftl_layout.c: 
131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:31:55.434 [2024-12-15 05:22:15.400913] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:31:55.434 [2024-12-15 05:22:15.400920] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:31:55.434 [2024-12-15 05:22:15.400928] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:31:55.434 [2024-12-15 05:22:15.400935] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:31:55.434 [2024-12-15 05:22:15.400942] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:31:55.434 [2024-12-15 05:22:15.400950] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:31:55.434 [2024-12-15 05:22:15.400957] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:31:55.434 [2024-12-15 05:22:15.400966] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:31:55.434 [2024-12-15 05:22:15.400973] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:31:55.434 [2024-12-15 05:22:15.400981] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:31:55.434 [2024-12-15 05:22:15.400988] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:31:55.434 [2024-12-15 05:22:15.400996] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:31:55.434 [2024-12-15 05:22:15.401003] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:31:55.434 [2024-12-15 05:22:15.401013] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:31:55.434 [2024-12-15 05:22:15.401021] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:31:55.434 [2024-12-15 05:22:15.401028] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:31:55.434 [2024-12-15 05:22:15.401035] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:31:55.434 [2024-12-15 05:22:15.401043] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:31:55.434 [2024-12-15 05:22:15.401050] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:31:55.434 [2024-12-15 05:22:15.401057] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:31:55.434 [2024-12-15 05:22:15.401065] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:31:55.434 [2024-12-15 05:22:15.401072] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:31:55.434 [2024-12-15 05:22:15.401079] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:31:55.434 [2024-12-15 05:22:15.401087] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:31:55.434 [2024-12-15 05:22:15.401095] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:31:55.434 [2024-12-15 05:22:15.401102] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:31:55.434 [2024-12-15 05:22:15.401109] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:31:55.434 [2024-12-15 05:22:15.401116] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:31:55.434 [2024-12-15 05:22:15.401124] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:31:55.434 [2024-12-15 05:22:15.401135] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:31:55.434 [2024-12-15 05:22:15.401142] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:31:55.434 
[2024-12-15 05:22:15.401150] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:31:55.434 [2024-12-15 05:22:15.401157] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:31:55.434 [2024-12-15 05:22:15.401164] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:31:55.434 [2024-12-15 05:22:15.401172] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:31:55.434 [2024-12-15 05:22:15.401179] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:31:55.434 [2024-12-15 05:22:15.401186] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:31:55.434 [2024-12-15 05:22:15.401193] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:31:55.434 [2024-12-15 05:22:15.401201] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:31:55.434 [2024-12-15 05:22:15.401211] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:31:55.434 [2024-12-15 05:22:15.401221] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:31:55.434 [2024-12-15 05:22:15.401229] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:31:55.434 [2024-12-15 05:22:15.401237] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:31:55.434 [2024-12-15 05:22:15.401245] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:31:55.434 [2024-12-15 05:22:15.401252] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:31:55.434 [2024-12-15 05:22:15.401262] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:31:55.434 [2024-12-15 05:22:15.401270] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:31:55.434 [2024-12-15 05:22:15.401278] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:31:55.434 [2024-12-15 05:22:15.401289] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:31:55.434 [2024-12-15 05:22:15.401301] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:31:55.434 [2024-12-15 05:22:15.401308] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:31:55.434 [2024-12-15 05:22:15.401315] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:31:55.434 [2024-12-15 05:22:15.401322] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:31:55.434 [2024-12-15 05:22:15.401329] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:31:55.434 [2024-12-15 05:22:15.401335] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:31:55.434 [2024-12-15 05:22:15.401342] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:31:55.434 [2024-12-15 05:22:15.401349] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:31:55.434 [2024-12-15 05:22:15.401356] 
upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:31:55.434 [2024-12-15 05:22:15.401363] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:31:55.434 [2024-12-15 05:22:15.401370] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:31:55.434 [2024-12-15 05:22:15.401376] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:31:55.434 [2024-12-15 05:22:15.401385] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:31:55.434 [2024-12-15 05:22:15.401392] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:31:55.434 [2024-12-15 05:22:15.401399] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:31:55.434 [2024-12-15 05:22:15.401407] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:31:55.434 [2024-12-15 05:22:15.401414] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x20 blk_sz:0x20 00:31:55.434 [2024-12-15 05:22:15.401422] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:31:55.434 [2024-12-15 05:22:15.401429] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:31:55.434 [2024-12-15 05:22:15.401446] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:31:55.434 [2024-12-15 05:22:15.401454] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.434 [2024-12-15 05:22:15.401461] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:31:55.434 [2024-12-15 05:22:15.401469] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.674 ms 00:31:55.435 [2024-12-15 05:22:15.401476] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.407566] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.407598] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:31:55.435 [2024-12-15 05:22:15.407608] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.038 ms 00:31:55.435 [2024-12-15 05:22:15.407615] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.407690] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.407702] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:31:55.435 [2024-12-15 05:22:15.407710] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.058 ms 00:31:55.435 [2024-12-15 05:22:15.407717] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.424139] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.424184] 
mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:31:55.435 [2024-12-15 05:22:15.424217] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 16.381 ms 00:31:55.435 [2024-12-15 05:22:15.424226] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.424262] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.424272] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:31:55.435 [2024-12-15 05:22:15.424285] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.003 ms 00:31:55.435 [2024-12-15 05:22:15.424293] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.424380] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.424394] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:31:55.435 [2024-12-15 05:22:15.424406] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.040 ms 00:31:55.435 [2024-12-15 05:22:15.424417] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.424545] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.424555] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:31:55.435 [2024-12-15 05:22:15.424563] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.114 ms 00:31:55.435 [2024-12-15 05:22:15.424576] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.430596] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.430762] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:31:55.435 [2024-12-15 05:22:15.430799] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.002 ms 00:31:55.435 [2024-12-15 05:22:15.430810] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.430939] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 2, empty chunks = 2 00:31:55.435 [2024-12-15 05:22:15.430960] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:31:55.435 [2024-12-15 05:22:15.430973] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.430984] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:31:55.435 [2024-12-15 05:22:15.430995] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.066 ms 00:31:55.435 [2024-12-15 05:22:15.431008] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.445007] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.445046] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:31:55.435 [2024-12-15 05:22:15.445055] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 13.980 ms 00:31:55.435 [2024-12-15 05:22:15.445068] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.445183] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.445192] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info 
metadata 00:31:55.435 [2024-12-15 05:22:15.445200] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.091 ms 00:31:55.435 [2024-12-15 05:22:15.445210] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.445249] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.445264] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:31:55.435 [2024-12-15 05:22:15.445272] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.001 ms 00:31:55.435 [2024-12-15 05:22:15.445278] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.445590] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.445660] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:31:55.435 [2024-12-15 05:22:15.445668] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.275 ms 00:31:55.435 [2024-12-15 05:22:15.445675] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.445692] mngt/ftl_mngt_p2l.c: 169:ftl_mngt_p2l_restore_ckpt: *NOTICE*: [FTL][ftl0] SHM: skipping p2l ckpt restore 00:31:55.435 [2024-12-15 05:22:15.445702] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.445712] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore P2L checkpoints 00:31:55.435 [2024-12-15 05:22:15.445719] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.014 ms 00:31:55.435 [2024-12-15 05:22:15.445726] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.453712] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:31:55.435 [2024-12-15 05:22:15.453837] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.453851] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:31:55.435 [2024-12-15 05:22:15.453860] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.094 ms 00:31:55.435 [2024-12-15 05:22:15.453872] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.456180] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.456210] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:31:55.435 [2024-12-15 05:22:15.456219] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.290 ms 00:31:55.435 [2024-12-15 05:22:15.456226] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.456290] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.456299] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:31:55.435 [2024-12-15 05:22:15.456307] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.032 ms 00:31:55.435 [2024-12-15 05:22:15.456316] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.456348] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.456357] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:31:55.435 [2024-12-15 05:22:15.456365] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] 
duration: 0.004 ms 00:31:55.435 [2024-12-15 05:22:15.456375] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.456402] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:31:55.435 [2024-12-15 05:22:15.456412] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.456421] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:31:55.435 [2024-12-15 05:22:15.456428] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.010 ms 00:31:55.435 [2024-12-15 05:22:15.456451] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.460839] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.460961] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:31:55.435 [2024-12-15 05:22:15.460983] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 4.366 ms 00:31:55.435 [2024-12-15 05:22:15.460991] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.461052] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:31:55.435 [2024-12-15 05:22:15.461061] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:31:55.435 [2024-12-15 05:22:15.461073] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.032 ms 00:31:55.435 [2024-12-15 05:22:15.461080] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:31:55.435 [2024-12-15 05:22:15.462239] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 64.404 ms, result 0 00:31:56.378  [2024-12-15T05:22:17.905Z] Copying: 14/1024 [MB] (14 MBps) [2024-12-15T05:22:18.477Z] Copying: 30/1024 [MB] (16 MBps) [2024-12-15T05:22:19.866Z] Copying: 50/1024 [MB] (19 MBps) [2024-12-15T05:22:20.810Z] Copying: 76/1024 [MB] (26 MBps) [2024-12-15T05:22:21.754Z] Copying: 101/1024 [MB] (24 MBps) [2024-12-15T05:22:22.698Z] Copying: 133/1024 [MB] (31 MBps) [2024-12-15T05:22:23.641Z] Copying: 149/1024 [MB] (16 MBps) [2024-12-15T05:22:24.584Z] Copying: 172/1024 [MB] (22 MBps) [2024-12-15T05:22:25.526Z] Copying: 185/1024 [MB] (12 MBps) [2024-12-15T05:22:26.912Z] Copying: 216/1024 [MB] (31 MBps) [2024-12-15T05:22:27.484Z] Copying: 236/1024 [MB] (20 MBps) [2024-12-15T05:22:28.870Z] Copying: 271/1024 [MB] (34 MBps) [2024-12-15T05:22:29.814Z] Copying: 304/1024 [MB] (33 MBps) [2024-12-15T05:22:30.790Z] Copying: 331/1024 [MB] (27 MBps) [2024-12-15T05:22:31.776Z] Copying: 348/1024 [MB] (17 MBps) [2024-12-15T05:22:32.720Z] Copying: 367/1024 [MB] (18 MBps) [2024-12-15T05:22:33.663Z] Copying: 379/1024 [MB] (11 MBps) [2024-12-15T05:22:34.607Z] Copying: 391/1024 [MB] (11 MBps) [2024-12-15T05:22:35.549Z] Copying: 411/1024 [MB] (19 MBps) [2024-12-15T05:22:36.492Z] Copying: 434/1024 [MB] (23 MBps) [2024-12-15T05:22:37.877Z] Copying: 450/1024 [MB] (15 MBps) [2024-12-15T05:22:38.820Z] Copying: 471/1024 [MB] (20 MBps) [2024-12-15T05:22:39.763Z] Copying: 493/1024 [MB] (21 MBps) [2024-12-15T05:22:40.706Z] Copying: 510/1024 [MB] (17 MBps) [2024-12-15T05:22:41.648Z] Copying: 527/1024 [MB] (17 MBps) [2024-12-15T05:22:42.589Z] Copying: 546/1024 [MB] (18 MBps) [2024-12-15T05:22:43.534Z] Copying: 567/1024 [MB] (20 MBps) [2024-12-15T05:22:44.478Z] Copying: 591/1024 [MB] (23 MBps) [2024-12-15T05:22:45.862Z] Copying: 608/1024 [MB] (17 MBps) 
[2024-12-15T05:22:46.805Z] Copying: 623/1024 [MB] (14 MBps) [2024-12-15T05:22:47.750Z] Copying: 642/1024 [MB] (19 MBps) [2024-12-15T05:22:48.691Z] Copying: 671/1024 [MB] (28 MBps) [2024-12-15T05:22:49.635Z] Copying: 683/1024 [MB] (12 MBps) [2024-12-15T05:22:50.579Z] Copying: 709/1024 [MB] (26 MBps) [2024-12-15T05:22:51.522Z] Copying: 721/1024 [MB] (12 MBps) [2024-12-15T05:22:52.909Z] Copying: 743/1024 [MB] (22 MBps) [2024-12-15T05:22:53.482Z] Copying: 755/1024 [MB] (11 MBps) [2024-12-15T05:22:54.870Z] Copying: 770/1024 [MB] (15 MBps) [2024-12-15T05:22:55.813Z] Copying: 783/1024 [MB] (12 MBps) [2024-12-15T05:22:56.756Z] Copying: 800/1024 [MB] (16 MBps) [2024-12-15T05:22:57.698Z] Copying: 810/1024 [MB] (10 MBps) [2024-12-15T05:22:58.642Z] Copying: 831/1024 [MB] (20 MBps) [2024-12-15T05:22:59.656Z] Copying: 850/1024 [MB] (19 MBps) [2024-12-15T05:23:00.600Z] Copying: 868/1024 [MB] (17 MBps) [2024-12-15T05:23:01.542Z] Copying: 889/1024 [MB] (20 MBps) [2024-12-15T05:23:02.486Z] Copying: 900/1024 [MB] (11 MBps) [2024-12-15T05:23:03.875Z] Copying: 920/1024 [MB] (20 MBps) [2024-12-15T05:23:04.819Z] Copying: 939/1024 [MB] (18 MBps) [2024-12-15T05:23:05.762Z] Copying: 955/1024 [MB] (16 MBps) [2024-12-15T05:23:06.707Z] Copying: 977/1024 [MB] (21 MBps) [2024-12-15T05:23:07.651Z] Copying: 990/1024 [MB] (12 MBps) [2024-12-15T05:23:08.595Z] Copying: 1008/1024 [MB] (18 MBps) [2024-12-15T05:23:09.539Z] Copying: 1023/1024 [MB] (14 MBps) [2024-12-15T05:23:09.539Z] Copying: 1024/1024 [MB] (average 19 MBps)[2024-12-15 05:23:09.328126] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:49.399 [2024-12-15 05:23:09.328369] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:32:49.399 [2024-12-15 05:23:09.328411] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:32:49.399 [2024-12-15 05:23:09.328423] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.399 [2024-12-15 05:23:09.332177] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:32:49.399 [2024-12-15 05:23:09.334333] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:49.399 [2024-12-15 05:23:09.334377] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:32:49.399 [2024-12-15 05:23:09.334389] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.079 ms 00:32:49.399 [2024-12-15 05:23:09.334400] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.399 [2024-12-15 05:23:09.345174] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:49.399 [2024-12-15 05:23:09.345222] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:32:49.399 [2024-12-15 05:23:09.345235] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.519 ms 00:32:49.399 [2024-12-15 05:23:09.345243] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.399 [2024-12-15 05:23:09.345275] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:49.399 [2024-12-15 05:23:09.345284] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Fast persist NV cache metadata 00:32:49.399 [2024-12-15 05:23:09.345294] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:32:49.399 [2024-12-15 05:23:09.345303] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.399 [2024-12-15 05:23:09.345367] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: 
[FTL][ftl0] Action 00:32:49.399 [2024-12-15 05:23:09.345380] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL SHM clean state 00:32:49.399 [2024-12-15 05:23:09.345389] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.024 ms 00:32:49.399 [2024-12-15 05:23:09.345403] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.399 [2024-12-15 05:23:09.345418] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:32:49.399 [2024-12-15 05:23:09.345431] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 127488 / 261120 wr_cnt: 1 state: open 00:32:49.399 [2024-12-15 05:23:09.345464] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:32:49.399 [2024-12-15 05:23:09.345472] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:32:49.399 [2024-12-15 05:23:09.345480] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:32:49.399 [2024-12-15 05:23:09.345488] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:32:49.399 [2024-12-15 05:23:09.345496] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:32:49.399 [2024-12-15 05:23:09.345504] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:32:49.399 [2024-12-15 05:23:09.345511] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:32:49.399 [2024-12-15 05:23:09.345521] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:32:49.399 [2024-12-15 05:23:09.345529] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:32:49.399 [2024-12-15 05:23:09.345538] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345546] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345553] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345561] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345569] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345577] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345585] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345592] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345600] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345607] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345616] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345624] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345632] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345640] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345650] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345658] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345675] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345683] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345691] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345698] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345706] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345715] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345723] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345730] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345737] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345745] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345752] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345760] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345769] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345776] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345784] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345792] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345800] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345808] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345815] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345823] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 
05:23:09.345831] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345838] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345846] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345853] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345862] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345869] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345877] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345885] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345893] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345901] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345913] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345921] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345928] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345936] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345944] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345952] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345959] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345967] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345975] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345982] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345990] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.345998] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346005] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346013] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346020] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 
00:32:49.400 [2024-12-15 05:23:09.346028] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346035] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346043] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346050] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346057] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346065] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346072] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346081] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346088] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346096] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346103] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346111] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346119] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346127] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346135] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346143] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346150] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346160] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346168] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346175] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346183] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346191] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346199] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346206] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346215] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 
wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346223] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346231] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346238] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346246] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free 00:32:49.400 [2024-12-15 05:23:09.346261] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] 00:32:49.400 [2024-12-15 05:23:09.346270] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 54c73827-c7ab-4552-9a64-af9833daf145 00:32:49.400 [2024-12-15 05:23:09.346278] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 127488 00:32:49.401 [2024-12-15 05:23:09.346286] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 127520 00:32:49.401 [2024-12-15 05:23:09.346293] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 127488 00:32:49.401 [2024-12-15 05:23:09.346300] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: 1.0003 00:32:49.401 [2024-12-15 05:23:09.346314] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits: 00:32:49.401 [2024-12-15 05:23:09.346322] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0 00:32:49.401 [2024-12-15 05:23:09.346331] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0 00:32:49.401 [2024-12-15 05:23:09.346338] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0 00:32:49.401 [2024-12-15 05:23:09.346345] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0 00:32:49.401 [2024-12-15 05:23:09.346352] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:49.401 [2024-12-15 05:23:09.346361] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics 00:32:49.401 [2024-12-15 05:23:09.346369] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.935 ms 00:32:49.401 [2024-12-15 05:23:09.346376] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.348813] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:49.401 [2024-12-15 05:23:09.348840] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P 00:32:49.401 [2024-12-15 05:23:09.348854] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.420 ms 00:32:49.401 [2024-12-15 05:23:09.348862] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.348988] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:49.401 [2024-12-15 05:23:09.349002] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing 00:32:49.401 [2024-12-15 05:23:09.349011] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.103 ms 00:32:49.401 [2024-12-15 05:23:09.349019] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.356545] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.356588] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:32:49.401 [2024-12-15 05:23:09.356599] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] 
duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.356606] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.356671] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.356680] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:32:49.401 [2024-12-15 05:23:09.356688] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.356696] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.356753] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.356767] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:32:49.401 [2024-12-15 05:23:09.356775] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.356783] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.356799] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.356808] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:32:49.401 [2024-12-15 05:23:09.356816] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.356823] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.370232] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.370282] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:32:49.401 [2024-12-15 05:23:09.370293] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.370302] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.380846] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.380890] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:32:49.401 [2024-12-15 05:23:09.380901] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.380909] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.380955] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.380964] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:32:49.401 [2024-12-15 05:23:09.380973] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.380987] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.381023] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.381033] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:32:49.401 [2024-12-15 05:23:09.381047] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.381061] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.381114] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.381123] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:32:49.401 
[2024-12-15 05:23:09.381132] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.381142] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.381172] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.381182] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock 00:32:49.401 [2024-12-15 05:23:09.381189] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.381196] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.381235] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.381244] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:32:49.401 [2024-12-15 05:23:09.381253] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.381260] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.381305] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback 00:32:49.401 [2024-12-15 05:23:09.381316] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:32:49.401 [2024-12-15 05:23:09.381324] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms 00:32:49.401 [2024-12-15 05:23:09.381333] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:49.401 [2024-12-15 05:23:09.381483] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL fast shutdown', duration = 54.957 ms, result 0 00:32:49.971 00:32:49.971 00:32:49.971 05:23:10 ftl.ftl_restore_fast -- ftl/restore.sh@80 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_dd --ib=ftl0 --of=/home/vagrant/spdk_repo/spdk/test/ftl/testfile --json=/home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json --skip=131072 --count=262144 00:32:50.233 [2024-12-15 05:23:10.155329] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization... 
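A minimal sketch of how the numbers in this log fit together, assuming a 4 KiB FTL logical block size (inferred from the fact that --count=262144 blocks copies exactly 1024 MB below) and that the WAF printed by ftl_dev_dump_stats is simply total writes divided by user writes — the formula and the %.4f rounding are assumptions for illustration, not taken from ftl_debug.c:

```c
#include <stdio.h>

/* Illustrative annotation only -- not part of the test output.
 * Reproduces the sizing of the spdk_dd restore pass and the WAF
 * figures from the two ftl_dev_dump_stats dumps, assuming a 4 KiB
 * logical block size and WAF = total writes / user writes. */
int main(void)
{
    const double blk = 4096.0;                                 /* assumed block size */
    printf("skip:  %.0f MiB\n", 131072 * blk / (1024 * 1024)); /* --skip  -> 512 MiB offset */
    printf("count: %.0f MiB\n", 262144 * blk / (1024 * 1024)); /* --count -> 1024 MiB copied */

    /* Dump before this restore pass: total writes 127520, user writes 127488 */
    printf("WAF: %.4f\n", 127520.0 / 127488.0);                /* -> 1.0003, as logged */
    /* Dump after it:                total writes 3616,   user writes 3584   */
    printf("WAF: %.4f\n", 3616.0 / 3584.0);                    /* -> 1.0089, as logged */
    return 0;
}
```

Both ratios round to the WAF values the dumps report, which is consistent with the write counters being the only inputs to that statistic.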
00:32:50.233 [2024-12-15 05:23:10.155487] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid98659 ] 00:32:50.233 [2024-12-15 05:23:10.315999] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:50.233 [2024-12-15 05:23:10.345109] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0 00:32:50.494 [2024-12-15 05:23:10.455408] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:32:50.494 [2024-12-15 05:23:10.455506] bdev.c:8697:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: nvc0n1 00:32:50.494 [2024-12-15 05:23:10.617339] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.494 [2024-12-15 05:23:10.617399] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Check configuration 00:32:50.494 [2024-12-15 05:23:10.617414] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:32:50.494 [2024-12-15 05:23:10.617430] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.494 [2024-12-15 05:23:10.617507] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.494 [2024-12-15 05:23:10.617518] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev 00:32:50.494 [2024-12-15 05:23:10.617527] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.043 ms 00:32:50.494 [2024-12-15 05:23:10.617535] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.494 [2024-12-15 05:23:10.617560] mngt/ftl_mngt_bdev.c: 196:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using nvc0n1p0 as write buffer cache 00:32:50.494 [2024-12-15 05:23:10.617953] mngt/ftl_mngt_bdev.c: 236:ftl_mngt_open_cache_bdev: *NOTICE*: [FTL][ftl0] Using bdev as NV Cache device 00:32:50.494 [2024-12-15 05:23:10.617996] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.494 [2024-12-15 05:23:10.618004] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev 00:32:50.494 [2024-12-15 05:23:10.618017] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.440 ms 00:32:50.494 [2024-12-15 05:23:10.618025] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.494 [2024-12-15 05:23:10.618648] mngt/ftl_mngt_md.c: 455:ftl_mngt_load_sb: *NOTICE*: [FTL][ftl0] SHM: clean 1, shm_clean 1 00:32:50.494 [2024-12-15 05:23:10.618696] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.494 [2024-12-15 05:23:10.618706] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Load super block 00:32:50.494 [2024-12-15 05:23:10.618718] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.064 ms 00:32:50.494 [2024-12-15 05:23:10.618737] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.494 [2024-12-15 05:23:10.618806] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.494 [2024-12-15 05:23:10.618817] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Validate super block 00:32:50.494 [2024-12-15 05:23:10.618825] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.041 ms 00:32:50.494 [2024-12-15 05:23:10.618833] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.494 [2024-12-15 05:23:10.619098] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 
00:32:50.494 [2024-12-15 05:23:10.619110] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools 00:32:50.494 [2024-12-15 05:23:10.619119] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.231 ms 00:32:50.494 [2024-12-15 05:23:10.619129] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.494 [2024-12-15 05:23:10.619219] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.494 [2024-12-15 05:23:10.619236] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands 00:32:50.495 [2024-12-15 05:23:10.619249] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.073 ms 00:32:50.495 [2024-12-15 05:23:10.619257] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.495 [2024-12-15 05:23:10.619290] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.495 [2024-12-15 05:23:10.619299] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Register IO device 00:32:50.495 [2024-12-15 05:23:10.619307] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.007 ms 00:32:50.495 [2024-12-15 05:23:10.619314] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.495 [2024-12-15 05:23:10.619336] mngt/ftl_mngt_ioch.c: 57:io_channel_create_cb: *NOTICE*: [FTL][ftl0] FTL IO channel created on app_thread 00:32:50.495 [2024-12-15 05:23:10.621502] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.495 [2024-12-15 05:23:10.621536] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel 00:32:50.495 [2024-12-15 05:23:10.621547] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.171 ms 00:32:50.495 [2024-12-15 05:23:10.621555] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.495 [2024-12-15 05:23:10.621595] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.495 [2024-12-15 05:23:10.621605] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Decorate bands 00:32:50.495 [2024-12-15 05:23:10.621614] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:32:50.495 [2024-12-15 05:23:10.621628] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.495 [2024-12-15 05:23:10.621687] ftl_layout.c: 613:ftl_layout_setup: *NOTICE*: [FTL][ftl0] FTL layout setup mode 0 00:32:50.495 [2024-12-15 05:23:10.621716] upgrade/ftl_sb_v5.c: 278:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob load 0x150 bytes 00:32:50.495 [2024-12-15 05:23:10.621753] upgrade/ftl_sb_v5.c: 287:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] base layout blob load 0x48 bytes 00:32:50.495 [2024-12-15 05:23:10.621775] upgrade/ftl_sb_v5.c: 294:ftl_superblock_v5_load_blob_area: *NOTICE*: [FTL][ftl0] layout blob load 0x190 bytes 00:32:50.495 [2024-12-15 05:23:10.621882] upgrade/ftl_sb_v5.c: 92:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] nvc layout blob store 0x150 bytes 00:32:50.495 [2024-12-15 05:23:10.621893] upgrade/ftl_sb_v5.c: 101:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] base layout blob store 0x48 bytes 00:32:50.495 [2024-12-15 05:23:10.621904] upgrade/ftl_sb_v5.c: 109:ftl_superblock_v5_store_blob_area: *NOTICE*: [FTL][ftl0] layout blob store 0x190 bytes 00:32:50.495 [2024-12-15 05:23:10.621915] ftl_layout.c: 685:ftl_layout_setup: *NOTICE*: [FTL][ftl0] Base device capacity: 103424.00 MiB 00:32:50.495 [2024-12-15 05:23:10.621928] ftl_layout.c: 
687:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache device capacity: 5171.00 MiB 00:32:50.495 [2024-12-15 05:23:10.621938] ftl_layout.c: 689:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P entries: 20971520 00:32:50.495 [2024-12-15 05:23:10.621946] ftl_layout.c: 690:ftl_layout_setup: *NOTICE*: [FTL][ftl0] L2P address size: 4 00:32:50.495 [2024-12-15 05:23:10.621958] ftl_layout.c: 691:ftl_layout_setup: *NOTICE*: [FTL][ftl0] P2L checkpoint pages: 2048 00:32:50.495 [2024-12-15 05:23:10.621967] ftl_layout.c: 692:ftl_layout_setup: *NOTICE*: [FTL][ftl0] NV cache chunk count 5 00:32:50.495 [2024-12-15 05:23:10.621975] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.495 [2024-12-15 05:23:10.621987] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize layout 00:32:50.495 [2024-12-15 05:23:10.621999] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.291 ms 00:32:50.495 [2024-12-15 05:23:10.622010] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.495 [2024-12-15 05:23:10.622095] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.495 [2024-12-15 05:23:10.622106] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Verify layout 00:32:50.495 [2024-12-15 05:23:10.622113] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.069 ms 00:32:50.495 [2024-12-15 05:23:10.622121] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.495 [2024-12-15 05:23:10.622217] ftl_layout.c: 768:ftl_layout_dump: *NOTICE*: [FTL][ftl0] NV cache layout: 00:32:50.495 [2024-12-15 05:23:10.622235] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb 00:32:50.495 [2024-12-15 05:23:10.622243] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:32:50.495 [2024-12-15 05:23:10.622251] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622264] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region l2p 00:32:50.495 [2024-12-15 05:23:10.622280] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622287] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 80.00 MiB 00:32:50.495 [2024-12-15 05:23:10.622294] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md 00:32:50.495 [2024-12-15 05:23:10.622304] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622311] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:32:50.495 [2024-12-15 05:23:10.622318] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region band_md_mirror 00:32:50.495 [2024-12-15 05:23:10.622324] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 80.62 MiB 00:32:50.495 [2024-12-15 05:23:10.622331] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.50 MiB 00:32:50.495 [2024-12-15 05:23:10.622338] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md 00:32:50.495 [2024-12-15 05:23:10.622346] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.88 MiB 00:32:50.495 [2024-12-15 05:23:10.622352] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622359] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region nvc_md_mirror 00:32:50.495 [2024-12-15 05:23:10.622365] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 114.00 MiB 00:32:50.495 [2024-12-15 05:23:10.622372] ftl_layout.c: 
133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622379] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l0 00:32:50.495 [2024-12-15 05:23:10.622386] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 81.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622392] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:32:50.495 [2024-12-15 05:23:10.622399] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l1 00:32:50.495 [2024-12-15 05:23:10.622406] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 89.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622415] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:32:50.495 [2024-12-15 05:23:10.622422] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l2 00:32:50.495 [2024-12-15 05:23:10.622429] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 97.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622453] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:32:50.495 [2024-12-15 05:23:10.622461] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region p2l3 00:32:50.495 [2024-12-15 05:23:10.622467] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 105.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622474] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 8.00 MiB 00:32:50.495 [2024-12-15 05:23:10.622480] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md 00:32:50.495 [2024-12-15 05:23:10.622487] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622493] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:32:50.495 [2024-12-15 05:23:10.622500] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_md_mirror 00:32:50.495 [2024-12-15 05:23:10.622506] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.38 MiB 00:32:50.495 [2024-12-15 05:23:10.622513] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.25 MiB 00:32:50.495 [2024-12-15 05:23:10.622522] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log 00:32:50.495 [2024-12-15 05:23:10.622530] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.62 MiB 00:32:50.495 [2024-12-15 05:23:10.622537] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622546] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region trim_log_mirror 00:32:50.495 [2024-12-15 05:23:10.622554] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 113.75 MiB 00:32:50.495 [2024-12-15 05:23:10.622560] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622567] ftl_layout.c: 775:ftl_layout_dump: *NOTICE*: [FTL][ftl0] Base device layout: 00:32:50.495 [2024-12-15 05:23:10.622575] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region sb_mirror 00:32:50.495 [2024-12-15 05:23:10.622583] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.00 MiB 00:32:50.495 [2024-12-15 05:23:10.622594] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 0.12 MiB 00:32:50.495 [2024-12-15 05:23:10.622606] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region vmap 00:32:50.495 [2024-12-15 05:23:10.622613] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 102400.25 MiB 00:32:50.495 [2024-12-15 05:23:10.622620] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 3.38 MiB 00:32:50.495 
[2024-12-15 05:23:10.622627] ftl_layout.c: 130:dump_region: *NOTICE*: [FTL][ftl0] Region data_btm 00:32:50.495 [2024-12-15 05:23:10.622633] ftl_layout.c: 131:dump_region: *NOTICE*: [FTL][ftl0] offset: 0.25 MiB 00:32:50.495 [2024-12-15 05:23:10.622640] ftl_layout.c: 133:dump_region: *NOTICE*: [FTL][ftl0] blocks: 102400.00 MiB 00:32:50.495 [2024-12-15 05:23:10.622648] upgrade/ftl_sb_v5.c: 408:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - nvc: 00:32:50.495 [2024-12-15 05:23:10.622659] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x0 ver:5 blk_offs:0x0 blk_sz:0x20 00:32:50.495 [2024-12-15 05:23:10.622667] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x2 ver:0 blk_offs:0x20 blk_sz:0x5000 00:32:50.495 [2024-12-15 05:23:10.622677] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x3 ver:2 blk_offs:0x5020 blk_sz:0x80 00:32:50.495 [2024-12-15 05:23:10.622684] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x4 ver:2 blk_offs:0x50a0 blk_sz:0x80 00:32:50.495 [2024-12-15 05:23:10.622691] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xa ver:2 blk_offs:0x5120 blk_sz:0x800 00:32:50.495 [2024-12-15 05:23:10.622697] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xb ver:2 blk_offs:0x5920 blk_sz:0x800 00:32:50.496 [2024-12-15 05:23:10.622705] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xc ver:2 blk_offs:0x6120 blk_sz:0x800 00:32:50.496 [2024-12-15 05:23:10.622713] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xd ver:2 blk_offs:0x6920 blk_sz:0x800 00:32:50.496 [2024-12-15 05:23:10.622719] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xe ver:0 blk_offs:0x7120 blk_sz:0x40 00:32:50.496 [2024-12-15 05:23:10.622726] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xf ver:0 blk_offs:0x7160 blk_sz:0x40 00:32:50.496 [2024-12-15 05:23:10.622733] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x10 ver:1 blk_offs:0x71a0 blk_sz:0x20 00:32:50.496 [2024-12-15 05:23:10.622740] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x11 ver:1 blk_offs:0x71c0 blk_sz:0x20 00:32:50.496 [2024-12-15 05:23:10.622747] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x6 ver:2 blk_offs:0x71e0 blk_sz:0x20 00:32:50.496 [2024-12-15 05:23:10.622754] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x7 ver:2 blk_offs:0x7200 blk_sz:0x20 00:32:50.496 [2024-12-15 05:23:10.622762] upgrade/ftl_sb_v5.c: 416:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x7220 blk_sz:0x13c0e0 00:32:50.496 [2024-12-15 05:23:10.622771] upgrade/ftl_sb_v5.c: 422:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] SB metadata layout - base dev: 00:32:50.496 [2024-12-15 05:23:10.622780] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x1 ver:5 blk_offs:0x0 blk_sz:0x20 00:32:50.496 [2024-12-15 05:23:10.622789] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe 
ver:0 blk_offs:0x20 blk_sz:0x20 00:32:50.496 [2024-12-15 05:23:10.622798] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x9 ver:0 blk_offs:0x40 blk_sz:0x1900000 00:32:50.496 [2024-12-15 05:23:10.622806] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0x5 ver:0 blk_offs:0x1900040 blk_sz:0x360 00:32:50.496 [2024-12-15 05:23:10.622813] upgrade/ftl_sb_v5.c: 430:ftl_superblock_v5_md_layout_dump: *NOTICE*: [FTL][ftl0] Region type:0xfffffffe ver:0 blk_offs:0x19003a0 blk_sz:0x3fc60 00:32:50.496 [2024-12-15 05:23:10.622820] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.496 [2024-12-15 05:23:10.622828] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Layout upgrade 00:32:50.496 [2024-12-15 05:23:10.622836] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.673 ms 00:32:50.496 [2024-12-15 05:23:10.622844] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.632744] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.632785] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata 00:32:50.759 [2024-12-15 05:23:10.632796] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 9.858 ms 00:32:50.759 [2024-12-15 05:23:10.632803] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.632885] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.632894] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize band addresses 00:32:50.759 [2024-12-15 05:23:10.632903] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.060 ms 00:32:50.759 [2024-12-15 05:23:10.632917] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.652112] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.652176] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache 00:32:50.759 [2024-12-15 05:23:10.652203] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 19.140 ms 00:32:50.759 [2024-12-15 05:23:10.652214] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.652267] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.652288] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map 00:32:50.759 [2024-12-15 05:23:10.652300] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.004 ms 00:32:50.759 [2024-12-15 05:23:10.652310] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.652432] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.652478] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map 00:32:50.759 [2024-12-15 05:23:10.652490] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.056 ms 00:32:50.759 [2024-12-15 05:23:10.652499] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.652653] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.652665] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata 00:32:50.759 [2024-12-15 05:23:10.652676] 
mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.132 ms 00:32:50.759 [2024-12-15 05:23:10.652685] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.660887] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.660928] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc 00:32:50.759 [2024-12-15 05:23:10.660955] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 8.178 ms 00:32:50.759 [2024-12-15 05:23:10.660963] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.661089] ftl_nv_cache.c:1772:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: full chunks = 4, empty chunks = 0 00:32:50.759 [2024-12-15 05:23:10.661102] ftl_nv_cache.c:1776:ftl_nv_cache_load_state: *NOTICE*: [FTL][ftl0] FTL NV Cache: state loaded successfully 00:32:50.759 [2024-12-15 05:23:10.661112] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.661120] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore NV cache metadata 00:32:50.759 [2024-12-15 05:23:10.661128] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.055 ms 00:32:50.759 [2024-12-15 05:23:10.661138] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.673440] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.673490] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore valid map metadata 00:32:50.759 [2024-12-15 05:23:10.673501] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 12.277 ms 00:32:50.759 [2024-12-15 05:23:10.673517] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.673653] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.673664] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore band info metadata 00:32:50.759 [2024-12-15 05:23:10.673673] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.107 ms 00:32:50.759 [2024-12-15 05:23:10.673685] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.673737] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.673752] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore trim metadata 00:32:50.759 [2024-12-15 05:23:10.673766] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.001 ms 00:32:50.759 [2024-12-15 05:23:10.673773] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.674082] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.674096] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize P2L checkpointing 00:32:50.759 [2024-12-15 05:23:10.674104] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.270 ms 00:32:50.759 [2024-12-15 05:23:10.674112] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.674127] mngt/ftl_mngt_p2l.c: 169:ftl_mngt_p2l_restore_ckpt: *NOTICE*: [FTL][ftl0] SHM: skipping p2l ckpt restore 00:32:50.759 [2024-12-15 05:23:10.674136] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.674148] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: 
[FTL][ftl0] name: Restore P2L checkpoints 00:32:50.759 [2024-12-15 05:23:10.674155] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.011 ms 00:32:50.759 [2024-12-15 05:23:10.674162] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.683872] ftl_l2p_cache.c: 458:ftl_l2p_cache_init: *NOTICE*: l2p maximum resident size is: 9 (of 10) MiB 00:32:50.759 [2024-12-15 05:23:10.684026] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.684037] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize L2P 00:32:50.759 [2024-12-15 05:23:10.684053] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 9.847 ms 00:32:50.759 [2024-12-15 05:23:10.684065] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.686643] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.686675] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Restore L2P 00:32:50.759 [2024-12-15 05:23:10.686685] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.554 ms 00:32:50.759 [2024-12-15 05:23:10.686693] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.686772] mngt/ftl_mngt_band.c: 414:ftl_mngt_finalize_init_bands: *NOTICE*: [FTL][ftl0] SHM: band open P2L map df_id 0x2400000 00:32:50.759 [2024-12-15 05:23:10.687525] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.687546] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize band initialization 00:32:50.759 [2024-12-15 05:23:10.687560] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.774 ms 00:32:50.759 [2024-12-15 05:23:10.687568] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.687598] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.687614] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Start core poller 00:32:50.759 [2024-12-15 05:23:10.687623] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:32:50.759 [2024-12-15 05:23:10.687631] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.687672] mngt/ftl_mngt_self_test.c: 208:ftl_mngt_self_test: *NOTICE*: [FTL][ftl0] Self test skipped 00:32:50.759 [2024-12-15 05:23:10.687686] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.687695] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Self test on startup 00:32:50.759 [2024-12-15 05:23:10.687707] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.013 ms 00:32:50.759 [2024-12-15 05:23:10.687719] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.693974] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.694023] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL dirty state 00:32:50.759 [2024-12-15 05:23:10.694034] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 6.235 ms 00:32:50.759 [2024-12-15 05:23:10.694042] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.694124] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:32:50.759 [2024-12-15 05:23:10.694134] mngt/ftl_mngt.c: 
428:trace_step: *NOTICE*: [FTL][ftl0] name: Finalize initialization 00:32:50.759 [2024-12-15 05:23:10.694144] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.036 ms 00:32:50.759 [2024-12-15 05:23:10.694151] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:32:50.759 [2024-12-15 05:23:10.695339] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL startup', duration = 77.564 ms, result 0 00:32:52.147  [2024-12-15T05:23:13.231Z] Copying: 15/1024 [MB] (15 MBps) [2024-12-15T05:23:14.173Z] Copying: 29/1024 [MB] (14 MBps) [2024-12-15T05:23:15.117Z] Copying: 56/1024 [MB] (26 MBps) [2024-12-15T05:23:16.060Z] Copying: 69/1024 [MB] (13 MBps) [2024-12-15T05:23:17.003Z] Copying: 82/1024 [MB] (13 MBps) [2024-12-15T05:23:17.946Z] Copying: 102/1024 [MB] (19 MBps) [2024-12-15T05:23:19.331Z] Copying: 118/1024 [MB] (16 MBps) [2024-12-15T05:23:19.904Z] Copying: 132/1024 [MB] (13 MBps) [2024-12-15T05:23:21.291Z] Copying: 145/1024 [MB] (13 MBps) [2024-12-15T05:23:22.234Z] Copying: 162/1024 [MB] (16 MBps) [2024-12-15T05:23:23.177Z] Copying: 185/1024 [MB] (22 MBps) [2024-12-15T05:23:24.122Z] Copying: 208/1024 [MB] (23 MBps) [2024-12-15T05:23:25.065Z] Copying: 224/1024 [MB] (15 MBps) [2024-12-15T05:23:26.006Z] Copying: 240/1024 [MB] (15 MBps) [2024-12-15T05:23:26.949Z] Copying: 260/1024 [MB] (19 MBps) [2024-12-15T05:23:27.924Z] Copying: 277/1024 [MB] (17 MBps) [2024-12-15T05:23:29.319Z] Copying: 296/1024 [MB] (18 MBps) [2024-12-15T05:23:30.262Z] Copying: 307/1024 [MB] (11 MBps) [2024-12-15T05:23:31.207Z] Copying: 328/1024 [MB] (20 MBps) [2024-12-15T05:23:32.152Z] Copying: 350/1024 [MB] (22 MBps) [2024-12-15T05:23:33.098Z] Copying: 368/1024 [MB] (18 MBps) [2024-12-15T05:23:34.044Z] Copying: 381/1024 [MB] (12 MBps) [2024-12-15T05:23:34.989Z] Copying: 392/1024 [MB] (11 MBps) [2024-12-15T05:23:35.938Z] Copying: 405/1024 [MB] (12 MBps) [2024-12-15T05:23:37.327Z] Copying: 424/1024 [MB] (19 MBps) [2024-12-15T05:23:37.900Z] Copying: 438/1024 [MB] (14 MBps) [2024-12-15T05:23:39.288Z] Copying: 453/1024 [MB] (14 MBps) [2024-12-15T05:23:40.232Z] Copying: 476/1024 [MB] (23 MBps) [2024-12-15T05:23:41.175Z] Copying: 495/1024 [MB] (18 MBps) [2024-12-15T05:23:42.119Z] Copying: 520/1024 [MB] (24 MBps) [2024-12-15T05:23:43.062Z] Copying: 541/1024 [MB] (21 MBps) [2024-12-15T05:23:44.007Z] Copying: 558/1024 [MB] (16 MBps) [2024-12-15T05:23:44.952Z] Copying: 580/1024 [MB] (22 MBps) [2024-12-15T05:23:45.896Z] Copying: 604/1024 [MB] (23 MBps) [2024-12-15T05:23:47.285Z] Copying: 627/1024 [MB] (23 MBps) [2024-12-15T05:23:48.228Z] Copying: 648/1024 [MB] (20 MBps) [2024-12-15T05:23:49.172Z] Copying: 667/1024 [MB] (19 MBps) [2024-12-15T05:23:50.116Z] Copying: 688/1024 [MB] (20 MBps) [2024-12-15T05:23:51.060Z] Copying: 704/1024 [MB] (15 MBps) [2024-12-15T05:23:52.004Z] Copying: 723/1024 [MB] (19 MBps) [2024-12-15T05:23:52.948Z] Copying: 740/1024 [MB] (16 MBps) [2024-12-15T05:23:53.893Z] Copying: 763/1024 [MB] (22 MBps) [2024-12-15T05:23:55.279Z] Copying: 781/1024 [MB] (18 MBps) [2024-12-15T05:23:56.223Z] Copying: 804/1024 [MB] (22 MBps) [2024-12-15T05:23:57.233Z] Copying: 820/1024 [MB] (16 MBps) [2024-12-15T05:23:58.178Z] Copying: 841/1024 [MB] (21 MBps) [2024-12-15T05:23:59.122Z] Copying: 862/1024 [MB] (20 MBps) [2024-12-15T05:24:00.066Z] Copying: 881/1024 [MB] (19 MBps) [2024-12-15T05:24:01.010Z] Copying: 899/1024 [MB] (17 MBps) [2024-12-15T05:24:01.954Z] Copying: 925/1024 [MB] (25 MBps) [2024-12-15T05:24:02.898Z] Copying: 945/1024 [MB] (20 MBps) 
[2024-12-15T05:24:04.293Z] Copying: 970/1024 [MB] (24 MBps) [2024-12-15T05:24:05.238Z] Copying: 991/1024 [MB] (21 MBps) [2024-12-15T05:24:05.499Z] Copying: 1014/1024 [MB] (22 MBps) [2024-12-15T05:24:06.073Z] Copying: 1024/1024 [MB] (average 18 MBps)[2024-12-15 05:24:05.761534] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:33:45.933 [2024-12-15 05:24:05.761623] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinit core IO channel 00:33:45.933 [2024-12-15 05:24:05.761643] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.005 ms 00:33:45.933 [2024-12-15 05:24:05.761656] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:33:45.933 [2024-12-15 05:24:05.761681] mngt/ftl_mngt_ioch.c: 136:io_channel_destroy_cb: *NOTICE*: [FTL][ftl0] FTL IO channel destroy on app_thread 00:33:45.933 [2024-12-15 05:24:05.762488] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:33:45.933 [2024-12-15 05:24:05.762518] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Unregister IO device 00:33:45.933 [2024-12-15 05:24:05.762532] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.791 ms 00:33:45.933 [2024-12-15 05:24:05.762552] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:33:45.933 [2024-12-15 05:24:05.762940] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:33:45.933 [2024-12-15 05:24:05.762951] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Stop core poller 00:33:45.933 [2024-12-15 05:24:05.762961] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.359 ms 00:33:45.933 [2024-12-15 05:24:05.762969] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:33:45.933 [2024-12-15 05:24:05.763001] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:33:45.933 [2024-12-15 05:24:05.763011] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Fast persist NV cache metadata 00:33:45.933 [2024-12-15 05:24:05.763020] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.006 ms 00:33:45.933 [2024-12-15 05:24:05.763029] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:33:45.933 [2024-12-15 05:24:05.763096] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action 00:33:45.933 [2024-12-15 05:24:05.763108] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Set FTL SHM clean state 00:33:45.933 [2024-12-15 05:24:05.763118] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.028 ms 00:33:45.933 [2024-12-15 05:24:05.763126] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0 00:33:45.934 [2024-12-15 05:24:05.763140] ftl_debug.c: 165:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Bands validity: 00:33:45.934 [2024-12-15 05:24:05.763158] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 1: 131072 / 261120 wr_cnt: 1 state: open 00:33:45.934 [2024-12-15 05:24:05.763168] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 2: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763178] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 3: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763185] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 4: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763194] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 5: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763202] ftl_debug.c: 
167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 6: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763210] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 7: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763218] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 8: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763226] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 9: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763235] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 10: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763243] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 11: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763251] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 12: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763260] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 13: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763268] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 14: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763276] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 15: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763288] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 16: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763296] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 17: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763304] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 18: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763312] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 19: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763320] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 20: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763327] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 21: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763335] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 22: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763343] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 23: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763351] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 24: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763359] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 25: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763366] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 26: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763387] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 27: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763395] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 28: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763402] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 29: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763410] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 30: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 
05:24:05.763418] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 31: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763426] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 32: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763454] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 33: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763462] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 34: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763470] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 35: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763478] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 36: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763486] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 37: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763493] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 38: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763501] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 39: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763508] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 40: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763517] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 41: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763525] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 42: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763534] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 43: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763541] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 44: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763550] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 45: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763558] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 46: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763566] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 47: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763575] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 48: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763584] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 49: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763592] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 50: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763600] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 51: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763609] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 52: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763617] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 53: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763625] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 54: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763633] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 55: 0 / 261120 wr_cnt: 0 state: free 
00:33:45.934 [2024-12-15 05:24:05.763640] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 56: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763648] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 57: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763655] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 58: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763663] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 59: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763671] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 60: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763678] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 61: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763686] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 62: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763694] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 63: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763701] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 64: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763708] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 65: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763716] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 66: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763723] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 67: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763731] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 68: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763738] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 69: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763746] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 70: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763753] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 71: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763761] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 72: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763769] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 73: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763777] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 74: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763784] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 75: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763792] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 76: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763799] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 77: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763807] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 78: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763815] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 79: 0 / 261120 wr_cnt: 0 state: free 00:33:45.934 [2024-12-15 05:24:05.763823] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 80: 0 / 261120 
wr_cnt: 0 state: free
00:33:45.934 [2024-12-15 05:24:05.763831] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 81: 0 / 261120 wr_cnt: 0 state: free
00:33:45.934 [2024-12-15 05:24:05.763839] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 82: 0 / 261120 wr_cnt: 0 state: free
00:33:45.934 [2024-12-15 05:24:05.763846] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 83: 0 / 261120 wr_cnt: 0 state: free
00:33:45.934 [2024-12-15 05:24:05.763854] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 84: 0 / 261120 wr_cnt: 0 state: free
00:33:45.934 [2024-12-15 05:24:05.763862] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 85: 0 / 261120 wr_cnt: 0 state: free
00:33:45.934 [2024-12-15 05:24:05.763869] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 86: 0 / 261120 wr_cnt: 0 state: free
00:33:45.934 [2024-12-15 05:24:05.763878] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 87: 0 / 261120 wr_cnt: 0 state: free
00:33:45.934 [2024-12-15 05:24:05.763886] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 88: 0 / 261120 wr_cnt: 0 state: free
00:33:45.934 [2024-12-15 05:24:05.763895] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 89: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763902] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 90: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763910] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 91: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763917] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 92: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763925] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 93: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763932] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 94: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763940] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 95: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763948] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 96: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763956] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 97: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763964] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 98: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763972] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 99: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763980] ftl_debug.c: 167:ftl_dev_dump_bands: *NOTICE*: [FTL][ftl0] Band 100: 0 / 261120 wr_cnt: 0 state: free
00:33:45.935 [2024-12-15 05:24:05.763995] ftl_debug.c: 211:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0]
00:33:45.935 [2024-12-15 05:24:05.764003] ftl_debug.c: 212:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] device UUID: 54c73827-c7ab-4552-9a64-af9833daf145
00:33:45.935 [2024-12-15 05:24:05.764016] ftl_debug.c: 213:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total valid LBAs: 131072
00:33:45.935 [2024-12-15 05:24:05.764023] ftl_debug.c: 214:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] total writes: 3616
00:33:45.935 [2024-12-15 05:24:05.764031] ftl_debug.c: 215:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] user writes: 3584
00:33:45.935 [2024-12-15 05:24:05.764041] ftl_debug.c: 216:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] WAF: 1.0089
00:33:45.935 [2024-12-15 05:24:05.764049] ftl_debug.c: 218:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] limits:
00:33:45.935 [2024-12-15 05:24:05.764057] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] crit: 0
00:33:45.935 [2024-12-15 05:24:05.764064] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] high: 0
00:33:45.935 [2024-12-15 05:24:05.764071] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] low: 0
00:33:45.935 [2024-12-15 05:24:05.764077] ftl_debug.c: 220:ftl_dev_dump_stats: *NOTICE*: [FTL][ftl0] start: 0
00:33:45.935 [2024-12-15 05:24:05.764086] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:33:45.935 [2024-12-15 05:24:05.764097] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Dump statistics
00:33:45.935 [2024-12-15 05:24:05.764105] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.947 ms
00:33:45.935 [2024-12-15 05:24:05.764113] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
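A note on the WAF figure in the statistics dump above: write amplification factor is the ratio of total device writes to user-initiated writes, so it can be checked directly against the two counters printed in the same dump. Assuming both counters are in the same units:

    WAF = total writes / user writes = 3616 / 3584 ≈ 1.0089

which matches the logged value; the extra 32 writes are presumably the FTL's own metadata and housekeeping traffic.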
00:33:45.935 [2024-12-15 05:24:05.766765] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:33:45.935 [2024-12-15 05:24:05.766812] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize L2P
00:33:45.935 [2024-12-15 05:24:05.766824] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 2.636 ms
00:33:45.935 [2024-12-15 05:24:05.766833] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.766958] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Action
00:33:45.935 [2024-12-15 05:24:05.766970] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Deinitialize P2L checkpointing
00:33:45.935 [2024-12-15 05:24:05.766978] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.098 ms
00:33:45.935 [2024-12-15 05:24:05.766989] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.774812] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.774862] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize reloc
00:33:45.935 [2024-12-15 05:24:05.774872] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.774880] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.774953] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.774961] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands metadata
00:33:45.935 [2024-12-15 05:24:05.774972] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.774980] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.775050] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.775066] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize trim map
00:33:45.935 [2024-12-15 05:24:05.775074] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.775082] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.775099] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.775107] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize valid map
00:33:45.935 [2024-12-15 05:24:05.775115] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.775123] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.791057] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.791119] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize NV cache
00:33:45.935 [2024-12-15 05:24:05.791130] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.791138] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.803408] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.803502] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize metadata
00:33:45.935 [2024-12-15 05:24:05.803514] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.803523] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.803580] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.803591] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize core IO channel
00:33:45.935 [2024-12-15 05:24:05.803607] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.803616] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.803653] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.803671] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize bands
00:33:45.935 [2024-12-15 05:24:05.803680] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.803695] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.803753] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.803763] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize memory pools
00:33:45.935 [2024-12-15 05:24:05.803771] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.803782] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.803811] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.803820] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Initialize superblock
00:33:45.935 [2024-12-15 05:24:05.803828] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.803836] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.803878] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.803887] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open cache bdev
00:33:45.935 [2024-12-15 05:24:05.803896] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.803907] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.803951] mngt/ftl_mngt.c: 427:trace_step: *NOTICE*: [FTL][ftl0] Rollback
00:33:45.935 [2024-12-15 05:24:05.803961] mngt/ftl_mngt.c: 428:trace_step: *NOTICE*: [FTL][ftl0] name: Open base bdev
00:33:45.935 [2024-12-15 05:24:05.803969] mngt/ftl_mngt.c: 430:trace_step: *NOTICE*: [FTL][ftl0] duration: 0.000 ms
00:33:45.935 [2024-12-15 05:24:05.803978] mngt/ftl_mngt.c: 431:trace_step: *NOTICE*: [FTL][ftl0] status: 0
00:33:45.935 [2024-12-15 05:24:05.804117] mngt/ftl_mngt.c: 459:finish_msg: *NOTICE*: [FTL][ftl0] Management process finished, name 'FTL fast shutdown', duration = 42.556 ms, result 0
00:33:45.935
00:33:45.935
00:33:45.935 05:24:06 ftl.ftl_restore_fast -- ftl/restore.sh@82 -- # md5sum -c /home/vagrant/spdk_repo/spdk/test/ftl/testfile.md5
00:33:48.487 /home/vagrant/spdk_repo/spdk/test/ftl/testfile: OK
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/restore.sh@84 -- # trap - SIGINT SIGTERM EXIT
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/restore.sh@85 -- # restore_kill
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/restore.sh@28 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/testfile
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/restore.sh@29 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/testfile.md5
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/restore.sh@30 -- # rm -f /home/vagrant/spdk_repo/spdk/test/ftl/config/ftl.json
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/restore.sh@32 -- # killprocess 96602
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- common/autotest_common.sh@954 -- # '[' -z 96602 ']'
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- common/autotest_common.sh@958 -- # kill -0 96602
00:33:48.487 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 958: kill: (96602) - No such process
00:33:48.487 Process with pid 96602 is not found
00:33:48.487 Remove shared memory files
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- common/autotest_common.sh@981 -- # echo 'Process with pid 96602 is not found'
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/restore.sh@33 -- # remove_shm
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/common.sh@204 -- # echo Remove shared memory files
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/common.sh@205 -- # rm -f rm -f
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/common.sh@206 -- # rm -f rm -f /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_band_md /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_l2p_l1 /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_l2p_l2 /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_l2p_l2_ctx /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_nvc_md /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_p2l_pool /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_sb /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_sb_shm /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_trim_bitmap /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_trim_log /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_trim_md /dev/hugepages/ftl_54c73827-c7ab-4552-9a64-af9833daf145_vmap
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/common.sh@207 -- # rm -f rm -f
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/common.sh@208 -- # rm -f rm -f /dev/shm/iscsi
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- ftl/common.sh@209 -- # rm -f rm -f
00:33:48.487
00:33:48.487 real 4m20.125s
00:33:48.487 user 4m8.314s
00:33:48.487 sys 0m11.615s
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- common/autotest_common.sh@1130 -- # xtrace_disable
00:33:48.487 05:24:08 ftl.ftl_restore_fast -- common/autotest_common.sh@10 -- # set +x
00:33:48.487 ************************************
00:33:48.487 END TEST ftl_restore_fast
00:33:48.487 ************************************
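The "testfile: OK" line above is the integrity assertion at the heart of this restore test: a checksum recorded before the FTL device was shut down must still match after the device is brought back. A minimal sketch of the same pattern, with the file names taken from the trace but the surrounding steps simplified (the real sequence is driven by ftl/restore.sh):

    # before shutdown: record a checksum of the data written through the device
    md5sum testfile > testfile.md5
    # ... shut down and restore the FTL bdev ...
    # after restore: re-read the data and verify; md5sum -c prints "testfile: OK"
    # on a match and exits non-zero on a mismatch, which fails the test
    md5sum -c testfile.md5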
00:33:48.487 05:24:08 ftl -- ftl/ftl.sh@1 -- # at_ftl_exit
00:33:48.487 05:24:08 ftl -- ftl/ftl.sh@14 -- # killprocess 88089
00:33:48.487 05:24:08 ftl -- common/autotest_common.sh@954 -- # '[' -z 88089 ']'
00:33:48.487 05:24:08 ftl -- common/autotest_common.sh@958 -- # kill -0 88089
00:33:48.487 /home/vagrant/spdk_repo/spdk/test/common/autotest_common.sh: line 958: kill: (88089) - No such process
00:33:48.487 Process with pid 88089 is not found
00:33:48.487 05:24:08 ftl -- common/autotest_common.sh@981 -- # echo 'Process with pid 88089 is not found'
00:33:48.487 05:24:08 ftl -- ftl/ftl.sh@17 -- # [[ -n 0000:00:11.0 ]]
00:33:48.487 05:24:08 ftl -- ftl/ftl.sh@19 -- # spdk_tgt_pid=99263
00:33:48.487 05:24:08 ftl -- ftl/ftl.sh@20 -- # waitforlisten 99263
00:33:48.487 05:24:08 ftl -- common/autotest_common.sh@835 -- # '[' -z 99263 ']'
00:33:48.487 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:33:48.487 05:24:08 ftl -- common/autotest_common.sh@839 -- # local rpc_addr=/var/tmp/spdk.sock
00:33:48.487 05:24:08 ftl -- common/autotest_common.sh@840 -- # local max_retries=100
00:33:48.487 05:24:08 ftl -- common/autotest_common.sh@842 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:33:48.487 05:24:08 ftl -- common/autotest_common.sh@844 -- # xtrace_disable
00:33:48.487 05:24:08 ftl -- ftl/ftl.sh@18 -- # /home/vagrant/spdk_repo/spdk/build/bin/spdk_tgt
00:33:48.487 05:24:08 ftl -- common/autotest_common.sh@10 -- # set +x
00:33:48.487 [2024-12-15 05:24:08.527104] Starting SPDK v25.01-pre git sha1 e01cb43b8 / DPDK 23.11.0 initialization...
00:33:48.487 [2024-12-15 05:24:08.527236] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --iova-mode=pa --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid99263 ]
00:33:48.748 [2024-12-15 05:24:08.686826] app.c: 919:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:48.748 [2024-12-15 05:24:08.708406] reactor.c: 995:reactor_run: *NOTICE*: Reactor started on core 0
00:33:49.322 05:24:09 ftl -- common/autotest_common.sh@864 -- # (( i == 0 ))
00:33:49.322 05:24:09 ftl -- common/autotest_common.sh@868 -- # return 0
00:33:49.322 05:24:09 ftl -- ftl/ftl.sh@21 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t PCIe -a 0000:00:11.0
00:33:49.583 nvme0n1
00:33:49.583 05:24:09 ftl -- ftl/ftl.sh@22 -- # clear_lvols
00:33:49.583 05:24:09 ftl -- ftl/common.sh@28 -- # jq -r '.[] | .uuid'
00:33:49.583 05:24:09 ftl -- ftl/common.sh@28 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_get_lvstores
00:33:49.583 05:24:09 ftl -- ftl/common.sh@28 -- # stores=05487b5d-3ef5-4c8e-9336-20d71573bb2d
00:33:49.583 05:24:09 ftl -- ftl/common.sh@29 -- # for lvs in $stores
00:33:49.583 05:24:09 ftl -- ftl/common.sh@30 -- # /home/vagrant/spdk_repo/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -u 05487b5d-3ef5-4c8e-9336-20d71573bb2d
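The clear_lvols helper traced above asks the running SPDK target for its logical-volume stores and deletes each one by UUID, so the base bdev comes back clean for the next test. A sketch of the same loop, assuming it is run from the SPDK repository root against the default /var/tmp/spdk.sock socket (bdev_lvol_get_lvstores and bdev_lvol_delete_lvstore are the RPCs visible in the trace; the jq filter is the one used there too):

    # list every lvstore known to the target and extract the UUIDs
    stores=$(scripts/rpc.py bdev_lvol_get_lvstores | jq -r '.[] | .uuid')
    # delete each store so no stale lvol state survives into the next run
    for lvs in $stores; do
        scripts/rpc.py bdev_lvol_delete_lvstore -u "$lvs"
    done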
00:33:49.844 05:24:09 ftl -- ftl/ftl.sh@23 -- # killprocess 99263
00:33:49.844 05:24:09 ftl -- common/autotest_common.sh@954 -- # '[' -z 99263 ']'
00:33:49.844 05:24:09 ftl -- common/autotest_common.sh@958 -- # kill -0 99263
00:33:49.844 05:24:09 ftl -- common/autotest_common.sh@959 -- # uname
00:33:49.844 05:24:09 ftl -- common/autotest_common.sh@959 -- # '[' Linux = Linux ']'
00:33:49.844 05:24:09 ftl -- common/autotest_common.sh@960 -- # ps --no-headers -o comm= 99263
00:33:49.844 05:24:09 ftl -- common/autotest_common.sh@960 -- # process_name=reactor_0
00:33:49.844 killing process with pid 99263
00:33:49.844 05:24:09 ftl -- common/autotest_common.sh@964 -- # '[' reactor_0 = sudo ']'
00:33:49.844 05:24:09 ftl -- common/autotest_common.sh@972 -- # echo 'killing process with pid 99263'
00:33:49.844 05:24:09 ftl -- common/autotest_common.sh@973 -- # kill 99263
00:33:49.844 05:24:09 ftl -- common/autotest_common.sh@978 -- # wait 99263
00:33:50.422 05:24:10 ftl -- ftl/ftl.sh@27 -- # /home/vagrant/spdk_repo/spdk/scripts/setup.sh reset
00:33:50.422 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev
00:33:50.422 Waiting for block devices as requested
00:33:50.422 0000:00:11.0 (1b36 0010): uio_pci_generic -> nvme
00:33:50.683 0000:00:10.0 (1b36 0010): uio_pci_generic -> nvme
00:33:50.683 0000:00:12.0 (1b36 0010): uio_pci_generic -> nvme
00:33:50.683 0000:00:13.0 (1b36 0010): uio_pci_generic -> nvme
00:33:55.975 * Events for some block/disk devices (0000:00:13.0) were not caught, they may be missing
00:33:55.975 05:24:15 ftl -- ftl/ftl.sh@28 -- # remove_shm
00:33:55.975 Remove shared memory files
00:33:55.975 05:24:15 ftl -- ftl/common.sh@204 -- # echo Remove shared memory files
00:33:55.975 05:24:15 ftl -- ftl/common.sh@205 -- # rm -f rm -f
00:33:55.975 05:24:15 ftl -- ftl/common.sh@206 -- # rm -f rm -f
00:33:55.975 05:24:15 ftl -- ftl/common.sh@207 -- # rm -f rm -f
00:33:55.975 05:24:15 ftl -- ftl/common.sh@208 -- # rm -f rm -f /dev/shm/iscsi
00:33:55.975 05:24:15 ftl -- ftl/common.sh@209 -- # rm -f rm -f
00:33:55.975
00:33:55.975 real 16m49.700s
00:33:55.975 user 18m49.515s
00:33:55.975 sys 1m20.645s
00:33:55.975 05:24:15 ftl -- common/autotest_common.sh@1130 -- # xtrace_disable
00:33:55.975 ************************************
00:33:55.975 END TEST ftl
00:33:55.975 ************************************
00:33:55.975 05:24:15 ftl -- common/autotest_common.sh@10 -- # set +x
00:33:55.975 05:24:15 -- spdk/autotest.sh@346 -- # '[' 0 -eq 1 ']'
00:33:55.975 05:24:15 -- spdk/autotest.sh@350 -- # '[' 0 -eq 1 ']'
00:33:55.975 05:24:15 -- spdk/autotest.sh@355 -- # '[' 0 -eq 1 ']'
00:33:55.975 05:24:15 -- spdk/autotest.sh@359 -- # '[' 0 -eq 1 ']'
00:33:55.975 05:24:15 -- spdk/autotest.sh@366 -- # [[ 0 -eq 1 ]]
00:33:55.975 05:24:15 -- spdk/autotest.sh@370 -- # [[ 0 -eq 1 ]]
00:33:55.975 05:24:15 -- spdk/autotest.sh@374 -- # [[ 0 -eq 1 ]]
00:33:55.975 05:24:15 -- spdk/autotest.sh@378 -- # [[ '' -eq 1 ]]
00:33:55.975 05:24:15 -- spdk/autotest.sh@385 -- # trap - SIGINT SIGTERM EXIT
00:33:55.975 05:24:15 -- spdk/autotest.sh@387 -- # timing_enter post_cleanup
00:33:55.975 05:24:15 -- common/autotest_common.sh@726 -- # xtrace_disable
00:33:55.975 05:24:15 -- common/autotest_common.sh@10 -- # set +x
00:33:55.975 05:24:15 -- spdk/autotest.sh@388 -- # autotest_cleanup
00:33:55.975 05:24:15 -- common/autotest_common.sh@1396 -- # local autotest_es=0
00:33:55.975 05:24:15 -- common/autotest_common.sh@1397 -- # xtrace_disable
00:33:55.975 05:24:15 -- common/autotest_common.sh@10 -- # set +x
00:33:57.362 INFO: APP EXITING
00:33:57.362 INFO: killing all VMs
00:33:57.362 INFO: killing vhost app
00:33:57.362 INFO: EXIT DONE
00:33:57.624 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev
00:33:57.885 0000:00:11.0 (1b36 0010): Already using the nvme driver
00:33:58.147 0000:00:10.0 (1b36 0010): Already using the nvme driver
00:33:58.147 0000:00:12.0 (1b36 0010): Already using the nvme driver
00:33:58.147 0000:00:13.0 (1b36 0010): Already using the nvme driver
00:33:58.408 0000:00:03.0 (1af4 1001): Active devices: mount@vda:vda2,mount@vda:vda3,mount@vda:vda5, so not binding PCI dev
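The killprocess helper, traced above for pid 99263 and earlier in this section for pids 96602 and 88089, leans on a detail worth noting: kill -0 sends no signal at all, it only performs the existence/permission check, which makes it a cheap liveness probe. A hedged sketch of the shape of that pattern (the real helper lives in autotest_common.sh and does more, such as the reactor_0 process-name check seen in the trace):

    # kill -0 succeeds only if the pid exists and is signalable
    if kill -0 "$pid" 2>/dev/null; then
        kill "$pid"       # ask the process to exit
        wait "$pid"       # reap it and collect its exit status
    else
        echo "Process with pid $pid is not found"
    fi

Note that wait only works for children of the current shell, which is presumably why the helper runs in the same shell session that launched spdk_tgt.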
00:33:58.670 Cleaning
00:33:58.670 Removing: /var/run/dpdk/spdk0/config
00:33:58.670 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0
00:33:58.670 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1
00:33:58.670 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2
00:33:58.670 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3
00:33:58.670 Removing: /var/run/dpdk/spdk0/fbarray_memzone
00:33:58.670 Removing: /var/run/dpdk/spdk0/hugepage_info
00:33:58.670 Removing: /var/run/dpdk/spdk0
00:33:58.670 Removing: /var/run/dpdk/spdk_pid71048
00:33:58.670 Removing: /var/run/dpdk/spdk_pid71206
00:33:58.670 Removing: /var/run/dpdk/spdk_pid71408
00:33:58.670 Removing: /var/run/dpdk/spdk_pid71495
00:33:58.670 Removing: /var/run/dpdk/spdk_pid71518
00:33:58.670 Removing: /var/run/dpdk/spdk_pid71630
00:33:58.931 Removing: /var/run/dpdk/spdk_pid71642
00:33:58.931 Removing: /var/run/dpdk/spdk_pid71825
00:33:58.931 Removing: /var/run/dpdk/spdk_pid71898
00:33:58.931 Removing: /var/run/dpdk/spdk_pid71978
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72072
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72158
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72192
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72229
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72299
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72387
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72812
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72854
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72900
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72911
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72969
00:33:58.931 Removing: /var/run/dpdk/spdk_pid72985
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73043
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73059
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73101
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73119
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73161
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73179
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73306
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73337
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73426
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73587
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73649
00:33:58.931 Removing: /var/run/dpdk/spdk_pid73680
00:33:58.931 Removing: /var/run/dpdk/spdk_pid74101
00:33:58.931 Removing: /var/run/dpdk/spdk_pid74194
00:33:58.931 Removing: /var/run/dpdk/spdk_pid74299
00:33:58.931 Removing: /var/run/dpdk/spdk_pid74340
00:33:58.931 Removing: /var/run/dpdk/spdk_pid74361
00:33:58.931 Removing: /var/run/dpdk/spdk_pid74434
00:33:58.931 Removing: /var/run/dpdk/spdk_pid75052
00:33:58.931 Removing: /var/run/dpdk/spdk_pid75083
00:33:58.931 Removing: /var/run/dpdk/spdk_pid75539
00:33:58.931 Removing: /var/run/dpdk/spdk_pid75627
00:33:58.931 Removing: /var/run/dpdk/spdk_pid75743
00:33:58.931 Removing: /var/run/dpdk/spdk_pid75785
00:33:58.931 Removing: /var/run/dpdk/spdk_pid75805
00:33:58.931 Removing: /var/run/dpdk/spdk_pid75836
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77648
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77771
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77781
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77793
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77832
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77836
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77848
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77893
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77897
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77909
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77954
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77958
00:33:58.931 Removing: /var/run/dpdk/spdk_pid77970
00:33:58.931 Removing: /var/run/dpdk/spdk_pid79354
00:33:58.931 Removing: /var/run/dpdk/spdk_pid79440
00:33:58.931 Removing: /var/run/dpdk/spdk_pid80834
00:33:58.931 Removing: /var/run/dpdk/spdk_pid82571
00:33:58.931 Removing: /var/run/dpdk/spdk_pid82629
00:33:58.931 Removing: /var/run/dpdk/spdk_pid82695
00:33:58.931 Removing: /var/run/dpdk/spdk_pid82799
00:33:58.931 Removing: /var/run/dpdk/spdk_pid82879
00:33:58.931 Removing: /var/run/dpdk/spdk_pid82965
00:33:58.931 Removing: /var/run/dpdk/spdk_pid83023
00:33:58.931 Removing: /var/run/dpdk/spdk_pid83087
00:33:58.931 Removing: /var/run/dpdk/spdk_pid83191
00:33:58.931 Removing: /var/run/dpdk/spdk_pid83272
00:33:58.931 Removing: /var/run/dpdk/spdk_pid83364
00:33:58.931 Removing: /var/run/dpdk/spdk_pid83416
00:33:58.931 Removing: /var/run/dpdk/spdk_pid83487
00:33:58.931 Removing: /var/run/dpdk/spdk_pid83580
00:33:58.931 Removing: /var/run/dpdk/spdk_pid83666
00:33:58.932 Removing: /var/run/dpdk/spdk_pid83751
00:33:58.932 Removing: /var/run/dpdk/spdk_pid83808
00:33:58.932 Removing: /var/run/dpdk/spdk_pid83878
00:33:58.932 Removing: /var/run/dpdk/spdk_pid83971
00:33:58.932 Removing: /var/run/dpdk/spdk_pid84056
00:33:58.932 Removing: /var/run/dpdk/spdk_pid84142
00:33:58.932 Removing: /var/run/dpdk/spdk_pid84200
00:33:58.932 Removing: /var/run/dpdk/spdk_pid84272
00:33:58.932 Removing: /var/run/dpdk/spdk_pid84335
00:33:58.932 Removing: /var/run/dpdk/spdk_pid84406
00:33:58.932 Removing: /var/run/dpdk/spdk_pid84498
00:33:58.932 Removing: /var/run/dpdk/spdk_pid84578
00:33:58.932 Removing: /var/run/dpdk/spdk_pid84667
00:33:59.193 Removing: /var/run/dpdk/spdk_pid84719
00:33:59.193 Removing: /var/run/dpdk/spdk_pid84788
00:33:59.193 Removing: /var/run/dpdk/spdk_pid84851
00:33:59.193 Removing: /var/run/dpdk/spdk_pid84920
00:33:59.193 Removing: /var/run/dpdk/spdk_pid85012
00:33:59.193 Removing: /var/run/dpdk/spdk_pid85097
00:33:59.193 Removing: /var/run/dpdk/spdk_pid85230
00:33:59.193 Removing: /var/run/dpdk/spdk_pid85503
00:33:59.193 Removing: /var/run/dpdk/spdk_pid85523
00:33:59.193 Removing: /var/run/dpdk/spdk_pid85964
00:33:59.193 Removing: /var/run/dpdk/spdk_pid86145
00:33:59.193 Removing: /var/run/dpdk/spdk_pid86235
00:33:59.193 Removing: /var/run/dpdk/spdk_pid86335
00:33:59.193 Removing: /var/run/dpdk/spdk_pid86372
00:33:59.193 Removing: /var/run/dpdk/spdk_pid86398
00:33:59.193 Removing: /var/run/dpdk/spdk_pid86695
00:33:59.193 Removing: /var/run/dpdk/spdk_pid86732
00:33:59.193 Removing: /var/run/dpdk/spdk_pid86780
00:33:59.193 Removing: /var/run/dpdk/spdk_pid87149
00:33:59.193 Removing: /var/run/dpdk/spdk_pid87286
00:33:59.193 Removing: /var/run/dpdk/spdk_pid88089
00:33:59.193 Removing: /var/run/dpdk/spdk_pid88199
00:33:59.193 Removing: /var/run/dpdk/spdk_pid88353
00:33:59.193 Removing: /var/run/dpdk/spdk_pid88461
00:33:59.193 Removing: /var/run/dpdk/spdk_pid88764
00:33:59.193 Removing: /var/run/dpdk/spdk_pid89029
00:33:59.193 Removing: /var/run/dpdk/spdk_pid89370
00:33:59.193 Removing: /var/run/dpdk/spdk_pid89530
00:33:59.193 Removing: /var/run/dpdk/spdk_pid89728
00:33:59.193 Removing: /var/run/dpdk/spdk_pid89770
00:33:59.193 Removing: /var/run/dpdk/spdk_pid89957
00:33:59.193 Removing: /var/run/dpdk/spdk_pid89980
00:33:59.193 Removing: /var/run/dpdk/spdk_pid90016
00:33:59.193 Removing: /var/run/dpdk/spdk_pid90280
00:33:59.193 Removing: /var/run/dpdk/spdk_pid90499
00:33:59.193 Removing: /var/run/dpdk/spdk_pid90999
00:33:59.193 Removing: /var/run/dpdk/spdk_pid91743
00:33:59.193 Removing: /var/run/dpdk/spdk_pid92366
00:33:59.193 Removing: /var/run/dpdk/spdk_pid93129
00:33:59.193 Removing: /var/run/dpdk/spdk_pid93262
00:33:59.193 Removing: /var/run/dpdk/spdk_pid93343
00:33:59.193 Removing: /var/run/dpdk/spdk_pid93823
00:33:59.193 Removing: /var/run/dpdk/spdk_pid93876
00:33:59.193 Removing: /var/run/dpdk/spdk_pid94463
00:33:59.193 Removing: /var/run/dpdk/spdk_pid94938
00:33:59.193 Removing: /var/run/dpdk/spdk_pid95690
00:33:59.193 Removing: /var/run/dpdk/spdk_pid95808
00:33:59.193 Removing: /var/run/dpdk/spdk_pid95840
00:33:59.193 Removing: /var/run/dpdk/spdk_pid95898
00:33:59.193 Removing: /var/run/dpdk/spdk_pid95944
00:33:59.193 Removing: /var/run/dpdk/spdk_pid95991
00:33:59.193 Removing: /var/run/dpdk/spdk_pid96182
00:33:59.193 Removing: /var/run/dpdk/spdk_pid96251
00:33:59.193 Removing: /var/run/dpdk/spdk_pid96313
00:33:59.193 Removing: /var/run/dpdk/spdk_pid96376
00:33:59.193 Removing: /var/run/dpdk/spdk_pid96416
00:33:59.193 Removing: /var/run/dpdk/spdk_pid96468
00:33:59.193 Removing: /var/run/dpdk/spdk_pid96602
00:33:59.193 Removing: /var/run/dpdk/spdk_pid96805
00:33:59.193 Removing: /var/run/dpdk/spdk_pid97412
00:33:59.193 Removing: /var/run/dpdk/spdk_pid98106
00:33:59.193 Removing: /var/run/dpdk/spdk_pid98659
00:33:59.193 Removing: /var/run/dpdk/spdk_pid99263
00:33:59.193 Clean
00:33:59.193 05:24:19 -- common/autotest_common.sh@1453 -- # return 0
00:33:59.193 05:24:19 -- spdk/autotest.sh@389 -- # timing_exit post_cleanup
00:33:59.193 05:24:19 -- common/autotest_common.sh@732 -- # xtrace_disable
00:33:59.193 05:24:19 -- common/autotest_common.sh@10 -- # set +x
00:33:59.455 05:24:19 -- spdk/autotest.sh@391 -- # timing_exit autotest
00:33:59.455 05:24:19 -- common/autotest_common.sh@732 -- # xtrace_disable
00:33:59.455 05:24:19 -- common/autotest_common.sh@10 -- # set +x
00:33:59.455 05:24:19 -- spdk/autotest.sh@392 -- # chmod a+r /home/vagrant/spdk_repo/spdk/../output/timing.txt
00:33:59.455 05:24:19 -- spdk/autotest.sh@394 -- # [[ -f /home/vagrant/spdk_repo/spdk/../output/udev.log ]]
00:33:59.455 05:24:19 -- spdk/autotest.sh@394 -- # rm -f /home/vagrant/spdk_repo/spdk/../output/udev.log
00:33:59.455 05:24:19 -- spdk/autotest.sh@396 -- # [[ y == y ]]
00:33:59.455 05:24:19 -- spdk/autotest.sh@398 -- # hostname
00:33:59.455 05:24:19 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 -q -c --no-external -d /home/vagrant/spdk_repo/spdk -t fedora39-cloud-1721788873-2326 -o /home/vagrant/spdk_repo/spdk/../output/cov_test.info
00:33:59.715 geninfo: WARNING: invalid characters removed from testname!
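The coverage steps that follow merge and prune the lcov tracefiles produced during the run: -c captures counters from the build tree, -a merges tracefiles, and -r removes files matching a pattern. A simplified sketch of that flow, with the long --rc flag lists and the real output paths elided for readability (the full commands appear above and below):

    # capture test-time counters from the instrumented build tree
    lcov -q -c --no-external -d /path/to/spdk -o cov_test.info
    # merge the pre-test baseline with the test capture
    lcov -q -a cov_base.info -a cov_test.info -o cov_total.info
    # strip third-party code (here DPDK) from the combined report
    lcov -q -r cov_total.info '*/dpdk/*' -o cov_total.info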
00:34:26.301 05:24:44 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 -q -a /home/vagrant/spdk_repo/spdk/../output/cov_base.info -a /home/vagrant/spdk_repo/spdk/../output/cov_test.info -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info
00:34:28.852 05:24:48 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 -q -r /home/vagrant/spdk_repo/spdk/../output/cov_total.info '*/dpdk/*' -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info
00:34:31.472 05:24:51 -- spdk/autotest.sh@404 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 -q -r /home/vagrant/spdk_repo/spdk/../output/cov_total.info --ignore-errors unused,unused '/usr/*' -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info
00:34:34.010 05:24:53 -- spdk/autotest.sh@405 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 -q -r /home/vagrant/spdk_repo/spdk/../output/cov_total.info '*/examples/vmd/*' -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info
00:34:36.554 05:24:56 -- spdk/autotest.sh@406 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 -q -r /home/vagrant/spdk_repo/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info
00:34:39.101 05:24:59 -- spdk/autotest.sh@407 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --rc geninfo_unexecuted_blocks=1 -q -r /home/vagrant/spdk_repo/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /home/vagrant/spdk_repo/spdk/../output/cov_total.info
00:34:42.416 05:25:01 -- spdk/autotest.sh@408 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:34:42.416 05:25:01 -- spdk/autorun.sh@1 -- $ timing_finish
00:34:42.416 05:25:01 -- common/autotest_common.sh@738 -- $ [[ -e /home/vagrant/spdk_repo/spdk/../output/timing.txt ]]
00:34:42.416 05:25:01 -- common/autotest_common.sh@740 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:34:42.416 05:25:01 -- common/autotest_common.sh@741 -- $ [[ -x /usr/local/FlameGraph/flamegraph.pl ]]
00:34:42.416 05:25:01 -- common/autotest_common.sh@744 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /home/vagrant/spdk_repo/spdk/../output/timing.txt
00:34:42.426 + [[ -n 5757 ]]
00:34:42.426 + sudo kill 5757
00:34:42.436 [Pipeline] }
00:34:42.440 [Pipeline] // timeout
00:34:42.444 [Pipeline] }
00:34:42.457 [Pipeline] // stage
00:34:42.462 [Pipeline] }
00:34:42.475 [Pipeline] // catchError
00:34:42.483 [Pipeline] stage
00:34:42.484 [Pipeline] { (Stop VM)
00:34:42.495 [Pipeline] sh
00:34:42.779 + vagrant halt
00:34:45.325 ==> default: Halting domain...
00:34:51.926 [Pipeline] sh
00:34:52.209 + vagrant destroy -f
00:34:54.751 ==> default: Removing domain...
00:34:55.332 [Pipeline] sh
00:34:55.612 + mv output /var/jenkins/workspace/nvme-vg-autotest/output
00:34:55.621 [Pipeline] }
00:34:55.635 [Pipeline] // stage
00:34:55.639 [Pipeline] }
00:34:55.650 [Pipeline] // dir
00:34:55.654 [Pipeline] }
00:34:55.666 [Pipeline] // wrap
00:34:55.670 [Pipeline] }
00:34:55.686 [Pipeline] // catchError
00:34:55.693 [Pipeline] stage
00:34:55.695 [Pipeline] { (Epilogue)
00:34:55.703 [Pipeline] sh
00:34:55.982 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:35:01.277 [Pipeline] catchError
00:35:01.279 [Pipeline] {
00:35:01.292 [Pipeline] sh
00:35:01.579 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:35:01.579 Artifacts sizes are good
00:35:01.610 [Pipeline] }
00:35:01.624 [Pipeline] // catchError
00:35:01.635 [Pipeline] archiveArtifacts
00:35:01.653 Archiving artifacts
00:35:01.761 [Pipeline] cleanWs
00:35:01.774 [WS-CLEANUP] Deleting project workspace...
00:35:01.774 [WS-CLEANUP] Deferred wipeout is used...
00:35:01.781 [WS-CLEANUP] done
00:35:01.783 [Pipeline] }
00:35:01.798 [Pipeline] // stage
00:35:01.804 [Pipeline] }
00:35:01.817 [Pipeline] // node
00:35:01.823 [Pipeline] End of Pipeline
00:35:01.870 Finished: SUCCESS